[Binary artifact — not recoverable as text. This file is a POSIX tar ("ustar") archive whose payload is gzip-compressed binary data; the remainder of the original content is the compressed log stream and cannot be rendered as prose. The only recoverable information is the archive's member listing:

  var/home/core/zuul-output/                      (directory, mode 0755, owner core:core)
  var/home/core/zuul-output/logs/                 (directory, mode 0755, owner core:core)
  var/home/core/zuul-output/logs/kubelet.log.gz   (file, mode 0644, owner core:core)

To inspect the actual kubelet log, extract and decompress the archive, e.g. `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz`.]
M9$le=?F+⶘QD&͡ q 'k 4z&(L" 6Gb% uFv:q2Jj>?P1NPNqaTsR`ɭ/:i㔽,.)t A0L$@u" #^'+ bfIH5Fۉ6+sٌr@V :֥֞*޾y.+ss<8|b}q1էĩ}<2OYlf3g%mGj/jw^_]OSox*^hUIh%$J뀖U2˺>SMΫ~~إ-Se}4>^jy7Z=ݼL-B?cbM_nO2&vjԻ˅m5V@PN}./ޚs!tdNO<- PbB]-.4Z"i߽ mpH%f^j{""C1rЛN$w0<*G7ThA4+뉪o]m3fO'U iV>N޵q$Be`GK̀vv sN/k}Ӥbk翟ᐢ(EMH$jSSyZIRfIurQl;Yi{Q V2zrffwrrYN2Hf'`N0WV֝'9@^LԼ|0֫ɦI=Ȥ}wa&YږqYFnOFsvZ܊ t6i[XmGRVI4~ 6Rt91(턙 ^y X0Ɯ,wl66{9\4"f6 <%?tcI-jZ0S$ ocDO.y q3`Ńy,TZ2y%ICag1J Ml@y1(j8Ox 9N F_\wsF ͤqA((9 1I<` N1J .zd WǔNo<2+\c l:z]u@++|nuw045D^vu"+!Χ@=WVDTu@D墶 @8e]} t@#'eTnTQ1Di@Ad".V 5ךROmp˼,W"~1NMcx=x.Ԧ0/wd^%aDԃΤ]}.9D'woM*(>hUYdiE¼-{c CQo.5 " 1ʅ<"($J $L&87@cB s`L8yr0#P[ mM~/ Ӿ6p[yRP8?㥨~9ƣi+7 }y?^V OXf;vՎscIRO{ڔ!N/UI_挘ȓ:[7!bSe2g]w?R6ErrRBr R8e^ q,tJ`U Jn)`_¿SIr.NrRK&:Vyy7֠?v:\U`FsT^3A%Dg_`d1t |@ge_%!N јR@yT(L!/YdK(xD 1yT֖;N:{A2f4%D*@pjT,eAry QFI{ χxKXN:Iۧ=k ۃPS^bV2VQ6/[wG,sD&jhm"Wco|킕Wcg}u9}&ioVrZq/p AJS ĩv!ڄp`9HfEy:鉶6S,b}u|Vs\~"9(nOYk>V:FYetrUkKUsy˻Un|frYN2Hf'`0ʺ:=`uBBqdbֻ菭^ږqYFwd?iI˭;MNn ݭ_zYDh~o Rtq3A.ŽaF9n1;6Nv=.?[o3{Rz7i!P~'hh ocJD#y q3`Ńy,TZ2y%Imh!ݨs(5v''|@.;тHt0<E1 eL'O8@`R:!M7slX+\,l:z]u1++{7 /j[ڣ IZ<E,d) NDX`W!t@-PjAw9Gt+61si?QdV0㩖>Ws ʔ >Y/f(DRARc"PbMDl5sG\GS^jhA\`:9#Ѹ*Øb E\'S- 3Mg|^CwD+r|Sjva @Fl/p-<ѾЦ&r祯"ir-o2DmInbmNtf\m/w⢣h&Tѐ0yg||8ȍ@>UT[B04  \;9;l:'U٭VuMZ5N /16ë̒q& ȑji_a ꀨ!\zϒ@4j.rb /,gŦr fs/F1$O Kۈ@;Ӕ0^"Y0rLJkZ\C3eKt`đNJbozRQr$mI]B.^ !1a^\'@%;U j['u뤅&cɘP+h Mj`$QS ~ M,H%;QDq]1x!e7v#m̴AiSADT&Zj$7k QISPn]-W]cEEOfyd!DɑQCY D\P ^tTV,o>۟#Ak% 5'Ji9h]aja}\>K="q}E 1qDM.5>0g( 샙5e~=%PO3z8OdMдj\ d儝\s%shb' O&ST_NTͽo$KqzB1梛k$HߟՇgpO/#ִg{ԝ~׽K[+JHwjM9x|jChpB㿯V\ (UsO[Դ͚g7#Y1S}㇫Es28[3ɵZLAbF[nnh(BVz.;w@HHث#X?җaX0L,QCjT̖|4\ =?p}TF6:ȶQ۞J-bI>2>}j+Ghvk:UrCyYgv///~|Ϗ?~:Ǐgy )0 Iy mf ozhC8|haUɧnŸ|q"ym/lk-@zaПtz ZM~]Y]C/̿F1RogYi)o ш!?ݸl[Hrrլb QJHvN2<Po%Yw C4<hͅW}/_p^Y.A2S:n$^#G'uZiw2o,s4@""&|&HI;#`N'-)6lBUr1x!ʏيzU+`.0VhQ-⭮̷nm^\54gZ{ʑ_eẅ́E_S`0``] >co#q'SzXvG+Kv3_$/xD.pI*A ^{^] :h@j1U!Y̑ɾB9-d̺gfy*'OlAVKDGͻG?e'_z PfrmeA 4pA'2Yh-a#ɽC<'q Zܳz(4XY\3 l+ f UdLDhPߠ'󢟉'>/I]Q zS76 tFCD)DLvZ_A FHtVMs o/»0K^Eo E/Zh?b;{} mi wPJ;Y̿{S6t.2>J$< 
sޒ\gAB 8J !,h&1JVFBR+D!YGȐ 184IX"t1Ro#=JDYlմ3=D0[bh}np4Iմ[EdžN~t?hiʯҜzࡨ#zH"cS ,JXpcZA @l`(}t.ҽ!,8޻!YRǓicJ_[G0&cz\I ̏~湺ZKS\ Kф Ę׾ﱄ2$Z~heƉ 4 Eq1b%ԨjEA` ؘ}5g &ݡ{]r'u]/gyDSsyKw!ͬ*4'g͙2%*e<]N o?̴8hR;Wc&xBRQ ŕO&).dI$V,jFmMIT$R=9hI|N&' Zz[3֜ݚq;J9MƢfSuYʦ W-} ;/Ӹ`>>t 4vTZYZXM3!)F<.kIWN a=lyX !aˈ*@2e6$L)E6jB (>dAe}e&cH{r  % T#~TA )DYg Ff#qFzE؇m8h$I1%EKDS톴dNENnu3?jX0{"4tlAx4tը-^?* ʱ=FuG0#w,X*bLHN?K/FQ2Z[LA%LdPXd8h%FԨPqu٭~tMYͿ{Y{υu+|z'z-G!/섆/JPŀ%i5TMh]-2隮C`.lH4N<Gl:r}gbMoٓ< >C{: y-'W]gwTZƎzlhZah~>j Xr*%0er^$SQjnfPCe&h^%8n(Ӥ&@;gfjޕ!-ك1B'*%Gq+x<+RX:Cfqb~ٮh,K߲ܚprS՜2a5 [)L#X'olHyș\v^wtG'xXs L@E&AH =@vJCRC|22J?gOKP֗.o78~Ç&:JEu a+} IARÓ!:Dih4h KO8e~H' #0MFyN3dy`B߃<` rP\sglϖ!J. +R* :A¹NfO:J)S3+iؿәK)t9B]AЅl;DYBKE`Km,}g9C- v6vikٔfl7UxK )#ONEAAÝC4wH5+Np%8C.5 k]4A#qlaj C钸wPIF'hM2OL!iyTN f4+IUQٖ]SVhyn_΁{b[ݾ} i A9~ǯY(sdEE2:so:c.Jr ^6pc +Kj 2A$Ԅ '@*U8˲:DrV,}hQ|Oܜ½͠2*qk֦M] W]khoKxw4]ki6/ztG2ۭU>]TQeĝ6|WxwUK lJn軹\>3xO<xS 8sZ;ʎnm<}׳ai&Kg54;Z t})[cE1i]_}5jfquo_~a0E|wVqRõu'Iiݱע7|?x/z?.s}A6#ƟGk٩nRN/go.ޘdLYw Y.eRzȆ q~ rWF ;Fs& ds5oTlʾAChBHu d bD2RCڍ*ic/R:s*!>z23zJpiG)0USO&w@CT.E@$Mlr蹗=zjcw#(41Ajk^D8pr?djU촽D??X]exn{%{3X۴Mg}уM֡lxmM"UX2֡!Jf6Ej@`>w~ Ho2Oսt0H X׼Vޛx[̅gՏ$9{47om6Wuƛ]Ez9m%Wj^߫{5Wۥk^߫{5W; bP߫Wj^߫{5Wj^ͻh-cݛlѱetl[Fǖѱetl[Fǖ[xDw˒VTĨ!=،ƐF3cE$1*L[)R61-*mQoxrJdQ*Pisjn“.X| K&iZ{2J*\Q9/Ub2n9&F]YBˆLPT H_o[svCĪCr|e%֗72kChNlj^n&qQ!rS/zy֭WTUEw5@*2ap1k/2dS}gȢXInoLIpwYHYWGڂEGCA QH ҩ+o }$&gB ]R!gX@t@6qcEY䖴D6V%N,g5gO9+n?lt9diHHʠ2F(E+Z$A2$WAZW 6 x>E9u_(=)I$ŞI΋EB:B\F* +6F#Д jٰINZ 7FW : *;LJ ~LC&Kk-D.ZbQG/cg7G,O>]͉eյ/nYk4Qb0/zþ:w#}iGӗTH'8U|J%Z[nf'i5fU-w҇ ML2\IsަSO,|ˣ= 1bBJXb &W\ۈ;2XJ`qSM<[y(t6CM:~W7ӺF?DEXWdy))ҍHTɒ&'"T{',^'%(N:e+,C) `*#A,[H@OuвZ۫f-r12ǏfSv<}yZlEylPޕ\Bܗ$-rj`&̝'K VWaS_O5I$eImز.gNY~JN0x?ޅ#N*&*X lB(S"0ꙖќSF a,:sPf[18(8Nj"<0   PΎt~JG^.>-*g M o3tI`!RĔ$$`x ̧(8sIpZ*?s:w~eLȔ=pB͙RZћ=wQ S3Zs9-\IotDc㈒4;s_F_~g0u?e9{JߟNu>V|UpF5pC9ag7BI)gܟgq`ٳ-RVQ "N(Ok8yn Y1LEqTT~~؝,jӞ ]sR*nl#Hf;Y$V*_*u-}wjC4F8!|P[BJMlϽ5Z1#kOٚHd0 ӽqoԆfA>xi;`a٫#X?u0}[Y$OȈ!G̻5h8ot8f?ٴ8>*#G/mԶRydQU<H>>w}[J|"hv,f*S4igqo.?>o~?~tN>׏~ Οp&ѣb8k/¿^6 
<?І@q\.CmRRr->aEzgtk@va[#~~ KG9-du^ N󍏟uYi)_Fh΅ηqϹʥKme4Nz/b(/ld)b QJ4fa8y7/{mc?#4:$xbD Zsai=2W+)o v RPpSYǭU ^!I!tQwӑ|>07W&`&hZDQD<^ tBz:ƂtPE.N_M﹊zU+^Vk0`ou+{+R>2Gz1 J+@pP9lk*."ƾt29Tdr`5JZI2MaZ, J$*O.4*@C51&(Ll"*G9Kv2q:80SlOͿWݰ)}]f3LBc`_GWpP5}Q[Ԃ]SUi(Bp};U6Lu4N 7ۋ͊K `(BReVAV!_ (BP43ynGp7FQAV(y~^rMwI+Lt)͵Q\ޓkQ4NScp@ :g(@,BɞY-k"7)6̥çC˨9X+m.sep*skKAw@ad(-IEp7q+PDpJɼNth7GY厳^?T;kNRc _f}I"=^Vv'%FQ=ydA6&n޸ C6 9TDJFU !9GtH2=k 8l :%*I% TiXl:%fr_R񽲐{z&La{q?mP`t7ד7 KSya6l`;8K%8 A1T:I L#xKd+-[D%8, wZP@sa4< )D%L{(!BĺD --aO/;x bOBQ&IEyPRf$eTG @q(x<ౖ :N8m9lL%k4C#ZU:îWMC^;̑adJ؏hųSWZ8A"\ɋ1"E'U*TN8S%& fNEݹǼ VzE#lsJJMd3BDN H#KvUwѴ6Ys}boo⎸|k\aǴ^]r\[ A:fD8B4)I$&x#HOQp p!$Wx/~x1uC.oq%R'J|~,$mpfTDD%P{ACa쌣LѠ=0^Xׄu'qF+`+|@uN:X ƂP*_S/ӫLVv$%Ό\1 >yd!DɑQCY D\P ^TVSӲ%XdBl_Ւtφ%+jҒG+w18Ɨ9$8IY9?azn}ۻ샙Vi}s岸_h&ymM8Ob2h+89EpC9aD\`4tJ M=- g%olDx;p2ǍSttr(f|J8hM+[e;s((Ⱦ_5_޽ _iT9o6x1r +2_8%&PP>͟dzluwon|{1;_8Zg#VKbQNN۱zlWEj.fṂS;Z0!-J7[q{Kg]ͰflfY> 87: G0bj16g]ݭ2rwNvW*uiu=q!#aW_LQ*{ƭ]_//ە";?8oï1eG_pf甴]$3 FفMO[7mj[45lӴ*o.%% Q_YҶ6꧋?S+y^-[7?yj!@1oQ+H6YZ;>~nR?C<t?ݸkvw2;2;ob qJ8yvfey\ww9]c͡F<1"ʰu:'vzx|SuVKrlJI LqgVTWS/pE QPK/;u#bYq9;z&39@Pj DQ>hOG + r.H Udjg;ozcO9FnB$A x=2_wF]mt:\݂xT2dkrxKTcI՚H]qfSD tN+tfRarա>1M{sn>VtOY-4I^ goQ}2|&l'7樔i&r<ՖŹx0J+:Qk5쬍6;lSɰw25wNx 9N F#&I ɃPy@c )g/Ξ9'\bȣ%R l{]uW7V 9fujBM>U4E,d) 4˕fEm !Ԡjqͥ4J{Tt*:xg-ʔ >Y/f(DRARc"PbMD=LS^jhAZ!A&'u<Ce`S,B8Rq JBaJs?:tAh_iz6urrט%Tѐ0yg|I%4r#OV'bq ,iB/$PW.Fyt`*L-dd,)@)c \Kɲy\3iEFTr a ꀨ!\zϒ@4j.rb /,gR/ tF1${23@@uʝiJR/,IL&`qk,`8ҋ@IO/vBO0$pTA)drM:)KKD2D8&L+)8*yCbPφ:_'mdM<Ǵ1V@3A3!()ՄD&(8jO`oc NbNl7 ZΆR6Rp# E:?2="H[؞Ep{s^E+|wί/W#{t1($ R!hI .dJ`Bx%j_t|UNЋi617_#07˾9dPBȟ,o0Zhԣ1d DpeEHZDh l$"bOYO\u;H䉠AGd+ 5Q1CV FLz.\kJ=4q01AxGU: CbleC!$:Nh3g[{;ughd;x.%^jNأAG2PQؓ1 'K࡞M·%@ JuubIs!k>V iD#E(=8SFd.V%:!?E]562zs6=ׅu04 *14>%AUsMFԷsh|9@eo+cSb^3GxcӳOQ$\ԒVRD݃VӔ xpYn`TkaYT"E:RH5xbj'A%ToSu2EWEp0:z.|BIDL` ܙ!LD$4BhSq֎8nuۤgH#lANAf_7U A=4/=Cpg'ݹCsa9F!kcCs&gwo ޭ3[WP޶@zچ>N^mHeJ).xUZy;lζ`{ZNݟPe(j˨- 9"Tyaz}@㏯E?.2afիE ݽy};`G[N1OwJW,i&]9C"e`jMnP|ʾX;GU;.(?S㿎,ZF͹,iL4rR<%Zi9\#T-] 
#9WP$HgA-C"8ŸU*PDpJMbv-EgN&Mn`:{;ZB '`NDtF%SWX nzN \8 yg#9YV\>QBPpGI L#x kbܯxVqc6kPUIHeӂ.^Gj D%L{^.$nT֖Qp6jT/c p!pjT,eAry 1'5WX酂@ VZ!z5b Zy)zG]uߦ6u{Gx?Uzj2dz8)~V$f UDo?ߊ2ز(T}Qp?(Gi&pPo(^VQ,Zb~ Lz]:7^|N3lߙ7:#V:T6dDͱW-8Wσsk7'mOn% XPϭqiQA^,,I=2Ts%~[_ie$ǜ$_6weCM?O+3Y[rf86D00417+;- ڜ8jF͌N!ͬ?= 1څL Gbdx2KsVhf7ߴEi#Ѹs"!य़xr L.޵q$_ DEvac<ؙ!c(6lK=^r°ew"#s<.\?*Zp\-lq><m88pR a4Ulny&;Y6߭({}w~/C/ijo9pY9nTVք9F%R^87F뵨૯7[j=>hkUb (|k]ds}k`ZͰ|/uᷞPbq~כ"e[?{}, oٻz_}>f}"$sdeKU*j)tVުPg8!D9Lȁm(,1tYp3 ˆ}Ww6| ݢxtnH`h\紞9V}Ť~d%V'i%9'4jI=Pu2\< ƖUCAFWFf FVٰdxəGo|*B%Ɛv%T.kRm7C UE,L.*|筝9RIxC[~8#ބt}M¾]#t߱?)e}̦=sK.ۜI*%"˚#7S0UsFXrErtg:jC(*ޠLH}h k,2V%HkdtI~9[p8 Wu{iw/v~=dߋvݜbeW,ϴ减9Ǝoȭc1_ z(؁խWg˞s,C| wU;wSL:!T.]ipgZiwcc.ֽQ 6⧳[ZeHH_WGkOg? ~x[xS?uȣ[lv5͖7˯W*Tީ]Ͳ掵̦ UƼhύn?01<`Xfi& ò5TН3͗fvqz[2rF6|KP@>֊A@t,DOFFOp5.HdЄ[nه7o{iohsKhm}E.o%djjm*[4$&]fs.GU +ZdPj@oS؆JYTSjE6!=_I77ؒEa!-UmR2Bx*[d6T"J%S f&h9zFÊkgZo߾˥ZSJH),eU d:IqHj(cN"4=rD. -&Xj5cVNcltLWG(|M1al}dD2'飾fI%\ }TBȘŵ V۶ThR٤2Cd F`sBƹ n4fku&_m kp͆W•9!>".PF O,o՛r}yRYT5&#-*c`b$L:zҞ{PdUIZQb%rHT,QDA+IAfmQFj \2%dr¡Qg%Y{:փZ{+LLjh)AP+5YKBD5bo21[E`1k_L:%HdLCR S ^igd U!hTh5"aB V`*(HPjsAwQVm SH*,e{Dcs:& gd#z,qkV@Qlz`ZC*eq8V䅶]RX2!WDP EcȣcHm̭*7)]A`[49lW8/wgㄺXY ] * +$(!2Rޡ!MM( `-3/龈^f<`̄ J Ÿ~%P VH(,䑙@}%i*UCI$#s(xb΢ Ä#/( DMɷ&:cr ` b>XƩ dVљ'JܛKBQš\`) VTud; Vkd5+(I(lBՈ1bqыVR`h 2BDFBi3X.ȁv+ІWXoX+pՕb1I x34HWmr&ڛQKo+f<k1D%͈A;L')APO[mOy'SIR`QE̾ \{#B ~t)mAj xѕQ jP:@5Bը+W@ @v1 hwOHd.6y-hG ;f2]2r< ;,5w \-+=x̌Aٗf׭b+BX36J Fx>X@a} 0ʁ,ΠV=ʠs!z~)Q#*`H/rZvMל 9ĤQbr-DL0K1;FaQf5d J8$0_@t呂)` $ti^ĥ5Ԟhdeg,*B! fWE]ō! 
~#&v *w^Df^2nŎeK]3)҇)g7pw}4oIٴeyM{>-Z'?i'Ei<- =fY=z 15dꐵ!km[Y&SSL-嘍o iEUSba5F"r94v:lJ,CFZ:cĦ#cpgg<ώy'<LKC3|v>;gg 3|v>;gg 3|v>;gg 3|v>;gg 3|v>;gg s>;m$9T/Glp|1bV戒xbs;s|͇ߖZ 'OD'fJChTBsuBs%ONhNHhNBh-4\1E֒"D"U#}xуM%׶̃ {lF(4aغ6hW/gso)]SZ\?ljƋ>8MK}5Zs698 !|wt{C[.қrݢ.CmO-A;IpWߎG?=by7-8?GsuB9iǟv̷>q;GχbΟc;6yDmN{ss'bZ'F:ȡ^'%OcA-Gg!aԿ\ɌiۺΧO[OP:Z%M ̒.CA9HQT)dU@;'ݘ/-ζ 8-[V)*66Ljc>mg!&\\7JDD|>\QB; + =^k:Xz[USQ29DV5t\Mf诨I˭l !wNSMӱ) 2=Taag{&af>ͽ{we=t癈L zܱ@[]oxKx&3bfKD#;u]%c]ᝠ5[ ӓ͆F7P/j'J'F_\4nǦB6Ya8Լ2hRV!…Q䍰VkTҸD6ڛyfrKL4|b[N7Iz&㳭zO;0圱H6:6+Biq?n^>,KPK`!8'[ |c"5xzȢ2GP+)TZ{(L!0X)>AhB#L4ʔ* W>ez2?nPݺ U􍴻TGK3TI #F%%pVM)*)q%j}|3y>'/] 8E%9W/7/ w`B2ڌ䥯Ҁ c Q"ǗK?z&*Fq9&&b%Mc{ ېcd;&g2q8IJ>V =6\QN+ *)!:m oɗ R2H`:U)29 YDrV-嬕1Km`~di":^&hJPaYT)DSIύ.0XPjC)-3Q =RBrSTe̖HϤ&tٌ/V5W+L'aFx2h3B3NL:U‰HDdh0 !+V8(B;Rup'{ۘi ~)ZKiһu.&ZXn`3"7%!'Ir+c}js /z|0XdW I搟[hćಉ1&dT䐼Y NPB?eT'$zQO<X..oE:qȡINjdfj*/&'%2,+d#/Qek a40 %1_[yȲy-rkTq燴^IkDwQFP Ǩ߁:b0"('ͽ9E@]eNҖ6%+X(cPwG˄iZLyfgd""ɔبtO[$&O ` $xU7QVLdY1yJf|/d_ W)K!4sE [2v_} -R6l̲ev pNb $'l݊?-WGRt3k6b|< UOk`-r?B\T_ 2-vo1i|{bC#$W؏4ΘF"kR4 cp?yc |{O$d`!HyJ5I3bRΔxIYVR*u j!U#nU!}#ݭ Y”cR| ]SAS7m &ڱ(^-n94=S0\ 6j{GN iP5.ED3]j8I6j Z7%NbS` 2$:IOphw}mVu ߫yX)^wckdP, Vog6_.V+0>0P, {kP11{߆En|w5OOËaGM}]vqq&.) wTsV2 74}QRKf;єKL.T?Kry&͗hY Ҿ\Lιo.BBmQkiu ,6:]+NWs0ӵllzO дu~5'V 07ӣq39 ^r.r F7FDBք -zYdA<ķGP]_n|tۖ>Js!葯:nJffAPPcp[*D,#u;Ϫ{UGqAz$($ɞP<Ԓ`Kh |U ن$巘 7q 1&<=9[t8G`BӨU79d2aVwp8ݴg8Shudqɋ⼍'l;j_:,h7%%ڂ#'A I"]QZU8x"JRx9=1ҁ01s@`:fYt))*T AI֌yX3nRT-Uʺ.ܫ.\PT6*UdӋm@Iq_  .PhU`'fbWb%\C8ԆJ`U567< XqV&W4W"jfή̄MªR&A9kl1Khjܱ@F (^2*qaSPTs%ke`"T+7ID"V' &&Kc*8e$]펐yXA(CшcW(+kD5b eq!J [ڬ4Q #J d$ eBs! Ʈ}aQoգA(5OyvGuψԹ8fL&c[a"3%q[>8Av GAHۜOBbI&"xm,bd0J߿XG_gJǐyZZ!:2DEp&! 
d:g*EKZy_'i>}݀6w6wuc}|]:,כ;N`l5[XxtSM;L za!XA/,Js,腅Z-Pi%r>"u[NF]r<uUUUrIz9J(+1yW`F]r8uUUUPTW/P]I 9& xA9XPKJP]@u ^R^='a({] _N._WYeGA53ГKWOf1h:ZFÏ?-nW=~)qx'x'x'x'x +N!{ĈOĈOĈOV=OKHO VуE|b'F|b'FOĈOĈO|4dJk+HF^>%ol qDʒ8fa77kWN.얋)/*L kmH_vQ`Nl ,B?%)RKRW=K|IQ=޵bq=Uuu4EOpF<Hp$'S;y>iR`]m5d OLνҬᙵ]Q/*aigFum=T>'r+i'f =>A2O>Pb[Wϭir;:F{njVb[6ݶwz7y]ddZnp39{mO/gWA9-s Ύ.l4ܹ]-)qI_&LʙW[?oJ_L L.153B{ L"] o&ٕ5*Nx8:]d``Z5FXbTiiyrJ] 5v# 5jg UTYHIԲeTT$eECED8D]1`)k bTmA,,h5T"=&bZ5FΞƑDSk7yEGzXqwd:}n>fr9mv8#?^{yPOIZRNq)!'HE$4 F V%أnH&4R9XI.Rg:EKbLx- U-kqW 9ژ6^|um>4ٯaӒ<@{Ʌ|_,Ka$)+i\o[u;#|D ~Fxwh86E; 3!N h"ju2Ahf.zIoa2lu+* $8@U!w\'r]g~t3 G: F9c>iG$a*Zz^sĒbQ;02 Z zjxPTǸָDpJ VD rwk޵^>&/g9ev9div `%JALӟ #K6kP‚7dZ,q/O'=qЕ$}ĹaЃ.R9j z뒑2hAJ \SZsJ.WYґm繢t䈢t8t2b 5n \ J@.(B\ S-P`+(5t dTɸ[be ][7BNdy\m|t &5'-ӵG_IԕH]B'eP(W4"t1q6r89t,&A Vh2 qVRDeQDצUE$.} 2RPL(I&rT@ dZ6j uh 'l˦yÏ ͔Yjw_ه/pҁ74P05 2(&N+ b<MB:dTg1VFo_STϦ:W.b 28T| .]iv37QJZoW$zGK1(g#@Kx*A9[e &4ΟFO ?,Nh ȿZ/ËWNUbv%Qg3\b+G#A[HE~kMVEnnاs!43i=C:X[3 A}j6})#Et|uΏQ`tY'7GFжw* ͅ [8F/9ݭ-lfrzD:<\V5 f''+u' <0^^6LXz_ՆɦtQ2mM`SM~"[m,u]N[ wa@x[.R i:CXq7r떯By6 f&n$՗47lXt㊠Yg&'u9+eY/f(DRARI-@2RJ,0v<唗Z)}VHI9EP`(:Tm߹9af֡GtdB\K/ōee2=_M慞k܌Ǐb\e/w⬣};'Tѐ0yg|I%4r#OV'bCF4u$wK NN#<[:IAjj2ޠE%<h:em/%1n]Kcƙ4"#GA9 u@T.gIh DTZJ9,gUr+&JQsdiPr'"x A$ FR.4Xip\+g)(X#t"Ц|<&ZQS 8*29I&N%r ""&iSڇVkPOI3=M<Ǵ1V@D%O5"$j5od8ZQGyOo̶V◛5nj:ϭ&zŽ "4_AQ̯{Cq.*Jr.*T*D#x1_'.%oJkGZlNÔpy۩:ʰIh.?'p 3BӨ#@'F5^_<^)m v $wqk%Au8A9-8 60sije XhA&E$ELDM#vFXNzZ`SmB uT+Z㉯xC^|*ʤsR إ>Kz&k5/%M%րM~@$dtO-aΕEV'4J@fk +d훮E7lgZB.,sL%i.q5C@L$]T_DD|ؚϗJ;PKP}PimNTGT8={H5󙜯Q WOJ`)1X+@SrQIXj,|p Wؽg8͠O<(Bp#!JC!,z.^>929;Y^H'DeDنzM9}nxWg9c=ul*٧N?R1,$ͷ2BT=:::JQQ}<|x/NZ2R-HqJ+`Yɥ0Z"5'蟜PR(~P5Ә \ѐJ i&! g[62Ncc IexZĜS)O7#g;:D ]ݑ_ zy$Rd}6)o\#MʛLVFkJS|)oJKQ}򹞔8$}h {SEFIpe Ӷ=9k5NfNEf`GD)zʀ&IN$w&rȕ4R`h"Eas_)C45bP )CQMnp2(l #gV6(@1v|;Om g&\ϞZGm;\wv'nGmT ͤ3k,~8K+IW76mν~}&iu3ܺj[osTotf{t-+lYMwnZ{=/o|jRa|}٧/P-˞G۽n`n ?ʬ7ogMw]?nt'R'(>٣l gFOG{eZ?phd %4U4xZ&\Ro{o*&QZg#!&i. 
O-f )#.*7,){Ŗs<Ȝ?S ĩv!ă2L,h5T"=֦b{v;뫛__5xiJi7G8CV.'m%Ő<%jbH:AwƁsBEdk#{ʑ2Wo9n)mW,Ksrīks(7?ҍ-ƪ!(~)UQ_hUYd\I0/Kj t{E#=IZ!󨀲M\YG XXB$(hN/о$,|52)rۑ9FevmCkF}Yw[^FUq!>^F{Io4Fr yilSpGV% gs|4 qKo0n;NcVI ު,aUV]01x$Q+3 C&Wr!SqTq8Y}ܿ2$7, k?6aMB­95L-XvzHt`/z/TZZ|=v󨔘 jx3ɘ,3mHp1%,x#+5W9-:ӻŞ*{ P2@E@HD83sNИT>\67Q ݈˶&fGJHsJ'^ո3ovD\z_I䎔iMM&z8;~*RA(*j+TjZErJV "ey@i:#'x\Z 'x=qV!a< ƨb%Ik7?w RSsKeh]-L?њޱy'dLqQ KSX=efo ο'uqVkcu>3)儝ݐJrN?uY=A!xu/M5l%ZqzBzr ' r-.{˻(3L*o6:ڃQx?mtpe_ZN㏋[wF#~NuߕcNЩ=ﶳY\v}_׷?_Oo޾L{Wj L%ٺH0bof w(547kehUra\JJNc{_wz l[i^koΊ7?i5P͘( +H6U4~EC?VR._Fh]~74!靰6>W1[SuJ4fa8;yї'_w]cPM \xe/m#L6 Bi ;* '-0ŝuZIP\%N@@*8 607;2U_NCwҜz!O,s4@""&|&H!'V0w0W !TC:N>};C#-B"N[($#Ksuⷁ'QW+1Wk0`ouxJЃfOPe󕠇+|%aLוQOJz _ z죧r}2wtwO/)Q"4&ޞF EK+HЖi6ZY]^J" 5,}Ɠ0u~1bgxk_IWx|y=,_sYke%YM7'7<$-/AVs]$4?O50۝jn}o kaz)ʅP<˧3{xbYWGf))6ywSǖU{P;Yn#A*4lbü -!2OO?\dKSr+.^ҭi 􆼻-H"'*LvTiTqEt c{ESuz{lTont6M\' GK 4m}{Cɷtmn;1G6uEVv@3u<{,TZR֙G_E_agmfųKށPTݟ8y 9N F< &dI xB$}򄃁$: (%Rhi^$=@К?L#3 ,G84ȋZ6:nT9c^G 7@|F2B=\l2dj>H%%|-f>0p:!)*X,WR 5TR:5/>~rwdxǢ(䝵<*Sr$dd1 h N>)%0rK>H+$䤎Gv( cEY/V*N{[:^)r[YRH(RW0M{;g\O#)&5t0R|Pؤ#*92)FnSEՉ@idISz-ƹn{ANy$pA)$I8uRzps/M0I{>}A-?NZo~1i'cB 8#4ՄV#? 
'$QS ~ U,H%{QDpTA=T5x5mws aѪp6k QI*(p7Ar=܆>@R[W8gQg;{PF6 ,c PH>:IB 4E2K%q!=ʱUT95ɍH_ ud].=`q_NrLx87`08@a K(tPGf֗U_~Wycvr2!|i_Yf OֹCn`mڹVHu&Heu3v=eswd[{)uM19[\bl yMI P4T m(:2$>clK;mʦMYS6$P`:jk-5!8X$J&E˘l' xbNzcpe%)Ĩ52)Xhbe7wl@aMQ8c%<^h]pJzbM",pe]Hec\h&918a6V;E!2^ hP`H1OӳcrIBsMBxSHT2ʌRGURȈ a|0^QN=" >jCF <Ӽ "E@6Bj8L`s*<"B Ɔ5D(A URj"Ј5@)Q#`5JъPD*q ^d{l= g:yo t3?o<8ƨ+;3Rh8C/q| H9w n16&)x*{V -uj8pz}k_ʢW7ۊ΃#> ʘ`.?uœr![yCZA C#B?|:ö$q7?f}N!r@ )B:u^- xiݎYC4L3q`~^ۚ|vlzGb睺~ZT<'U4R$v\(w[ %QhS69kEo_fUW.7>1k5R`1ȁKJU+k ˒G 0ْ$*E2>9]-NEp7qPDpJYT;=(NZB$﫶 iRbrrGɗC`K\VogVwYlć ~-u ؆OCuaV5$C 1m` "#'1\&ʔv.C\"I+RBrx.uH24=k 8l :%*I% TiX;=vWCm!/l y'>ۭ-u5m闩mO[f {=Kup23OOy@(֑p+j pg#9Y6\\NBP#$&D҅-vn< +XbqՆVծvghSѩ #,8E@l4^x~`MbقIHeӂzȈ iV+ϣB["ED6 ?ȉ"NuTtbo[>'MnC̆R,b18"QTX-b c~!RQ&IEyPR6.8!ފG=: ϙU%N\]Q}մiiWa'Pe͖_5Q*^Uʗ[c_k6ܵGV7޳r{3^1kGS]g{]G^.X/{JNJ{`,W^akڢbSY.b/޳]x[Q ͩ3gզ}w=ogV??Z+&rs֢ .f?ۻ~5lۇ)1|{]c8fȜ&1`rnR X($Bmڴޛ_9urβ t i$ͯG.oia?Ǎle? ~^W4qnWuE[הUŷPvKO [) Ķ$_r`^H%72J2CM7F;iiIi/9 3KrG_V,gg+w'^fzW0E|JJCΚg`9w0N+ 65:=FJ-%,o߰OY퍟vXfI*=oܰ~t?wvϰQwtxu=)[o+1K(c8YnT}(Qi*)6*F08R]Cb7>nDVC>!C O mCvW {?]=}(¦N>C Ϗ< Gvk!}&m$LpSBm %.Pn/H_WS;,Jz-GZH+ieT|V>GZH+i#|"GZH+i#i, $g)F KH uis&(c ^'b{ |w MAl)YnrFifA{|hǎ2'z!D1`;9/#7 \$ t"c({ ɡQ1Xj##jkK3J1&*yR:ĢXwkF 8-ϮVNun~סs˾;St YdzD(MÜ4]q&B_:I&#Xa-rl:ɗVm9& xi镍V5@q8eDO9xx; :8p%-#s|ҎHT2Pzq^)H,I*6!jlRI޻NhOC:E DK VD r=R/%yf3>-ؕ< y[}+ݗNEeG7^w VwTL\1zۻY֕ic8/qp$J@נodô XgN^?A;& oN G<> *3z$&2ǙI p% ED 1cARV$Eul #V46TR$2 <#:j4¦ĜS/SRXRُ֭3:o(w۱Z3~wu@5booۃ͛`>g>Z?[*BVo~ܕk}5x2va${1"^ {{5wDCiC9 ùV+QKQeZnTroKiwx(5+ˤ]Ԏ^nގrjC[N'۫~ ,v`<Q0]QF$y}suʣE?ԁ:PJe?o̍⨺jn/#m#`o~±P93 ԛ_~Sj<Ǭ2RW**vʊ#ueO[.W#v /~F.]fIWZ2-HqtЎ1+4^zh}x r;oA.(ނ\FLX@3e?Kp @9^MJpAjBΓoi=/ry`/@cקۏާM\LNK(G,8eB/B*|tV^rJ7xқFt|5&_j{>bF =qDD 8&4&g;%3g" F- /OVj!y8췓>-ffmvO]F;kG=˽klBέSxynǟnw79itnoexvf ~G-߿yW-9}{{cG{Fz4`K,"ٻ"ޢSzS4q|8>=Mbs_ {ӗPKdz泑6+ҏ""Eݼg^9B %Yq_eњi.KtKzGɲ7=n7XHX{))%p!Ŝ(aHI "@;G 
tSC`ûC{=]<*oN~.z\.KGd|\{虁ՃAq1q*^W{uv*}GԮ}&h}";Ҷ]ȸz[uZ/I-{IYeUI{U"/aEӴj1@nIXyX"cd&CeJ]#{v3Os5MDfʹnM9e:魯MzG:fEQः+36|A#!iB_qI=QeyULk)\peB5v&x\o>L//}l{09L3x#M Z˓/5 ~>glwh6ZO!O+2!I1P0dzIRR3xn4BPLy ٞVˤs;ttYЎYtH>z3j)t1Z]4NE`Iy)0-JM&s\]N()yaRh m 4!1uYf;#a33=ꬁ\\ nЁAWq?w繯`j)L>}𭛠\/()ٍ*FpXbVEgV "pJȀ5$*l}ΐ"U#Q &MnjB&6JjoFc!'Nz0).R٫̠1DFW6X$ RjDs$r0H 6Pd4g&J\ Ү`ԔCXZyQp8Gx-0Q0[Ն;E!0lJH:Q'_)pZFe) \KPKȳY݌)L^ܹWX& KsBPA*0A4xb!$x.nxYɵ`ͤD' HwD@4:&U:1UF03,*͸J3  +.d(`0 U?:oQH{A  Q0E#B:Е~=? 7#sSHh@RfU2x]tTqA #]g0s:CQ;[ mމ}ׯC0dc;3~3\C)@<^ I7jRD4LuyIa@d:f%0ƅS S9ri>)cOK>'OeQľd dv8)8 tx*CSo)0yY7,K0 8+:mqkK ݒo`A,$. i"ce:? hdp]Fꎾ”^%X7oJIt||4=8R <MݞZ`CcI%3P!arrkJ3գjx 3¯e꓏o/gwbUrF̹c0YT߮\Xmewgf4[aa{q&Ɨt6 ii3чѸQi&7BsUGg%hCX(Nf54Is`'.?‡8WwX}S)VTO`T}_~ɇw?~H??=?uoN>| Z/@1$z M㻟מZ#0_35UleO]/b#{>>thn[KQ%#~Pt-wҦӭx$G>Wlyuנ\}/Rş;Q`ClfŅ?ݸl;+}z-z,lɜ1S4 f1F߬~rm}%`?iT`#)E|o@?c|NKe#;,#0"5#H1cԍ#ex+0/m=ut{X&4K|eޘg8f8bJ{!XVsÉU*Q~MW'|M^|nC,%6`o,V@[-})GKW?k;ȿ 3*g` 0dsud gNn*D[6;t^֒3I6\!oDJa&2`E% ҀZLS6yA&NC64.ltݛ >no?uĂ|wlyxeNc9`j)cXy=yT_1%LIBV$raQ1Z޾NQ4epy~^~K?Ԋ-ݚm<#3>}$-IeZ&'GNo'>p%/?T"iț_GS|s8On"Qsdݎ\ƞp}~;j-MvTʹa ()eiY3sg+ڟ e\Hc g)#=2CTcp|jgCl 4A)jiw\/եodV`Lм3Jٻ_>eJb/Dv3f|9AJj턿毞û׻)M\ N`=2`ss!*ˈ :+*mGfGEߑ 303JES=luJq@Bi,$)V"'^q<v_we&'ރ-y;NHA)fD ;.Z)QO>M7A )赗sSXaL4k#B@S!T\qal%bHLJrXER%V*%dG(o"ЬyiMOIƸ2׮xꑬa d/q٠B dE\#'GD=Sbl{>Kb-fIY˃A3^Hm$(iQX]Ⱦ?~|[5ّe(7,4R .gL`/rΨKA6+9--&FA qI`FEBRSs@'Sa$6(pDBA֤mA,C"CIEm8ʍbfX]^|zfdMT?^gx ?7SnnZVZjLF -^ȹev9AdÁ]mb]Lq*.)/GMU{nVbR/4d4^we=nz%nvyL&1 kRqzWalzvVB FMi> &r)pY=2 fg#WG O !^1&J^ZXz]ՅI!R6aa`ZGّB%L׃>ȋ')%2ֵe+jxG&DRqSLù^NR=&-vLSz S현Z}oqV'wa{'mb2!\gwLRjbSգfYu71~=wLY)wmqH/q6g"@A^bd( t_bE#yF(ijcWҬf#?VMOK8 4dzK^لAi 8%9T,d  R@:NZ0Xu,O"'=G;X$ch!fojjR+tGj>>,9!aƽ7.!ۓM|lD@RP:wu| fe 4 (zW^ Mj>QF+X gaJj*;N$CE drdP6; 8'+VRJd 't!E}ˡLpSvADy+q6]ZjVovChc抃ErI`1Js P3(HX4e$< &rf*3 8\GKP|;ljS-}NX{4V_|&/\LD@;Pɘٵ2dEl8("x]狾 *?/6#hlTfXr2XP{cr br,uQb@iDIQ !mZTl~Xg#:8O-p蜷oA:Rrz` $,5SP5?~^}-^>ukROM `#PᱳZKq24>\<;QfiYC98jq7kS؟YZD*bv~>M+uGslxfkv}:Yе8!n)r~nMC-^R NJ!t&5܇[4u/cO?\]^xjœNetz|2[nviUmeߚI/&wDA5#m 
a=byx,H~>]'zr3}7׏orݨUZobI=xOsV_`ՄW9+GPOOcNOebxӏ>o|wG? \֎D"i'Z ͇6ڵلo0t-+qBohOq[{/W09`t=wMdT|Z ϿbM"Npv^ޕ[{cx{9.X.cݍ{VŴ#&WZK>ݜSy)<p8OFm'㺍?HgNM,2`^tAibE|~,Ky۠.lOJ(]1FPNtV*Z xPed!fe@KZдv,ˠ,`$!dهlr&F%_Xߧ`Ꮚa&N7Wf+jUwnVUVJN_V/P대LA\L>=:u HV!}nV QGz:rBK\M޿ԃÃI.YO/5hʾj}I ]ͼ7.vоD/kP%8}Pkފ0ýB_ : +]hXA)CsEvP(u>HΨDTTע381'l.;jr;n3NIܸa͚td_eErVm>gP ӵqȎ^);R#l U\.[urUJ/j#gs|ͱ\ʧ/uYIL'{ԙBݢZK&(J Z_l[m$ OU1G2\AN!C&R e%'rhj7]Ǥ[y[mkĬwlږ~@Lr޼(vtjՉE(ZCB̥S"X2*CzMwipXjH6M֨pv?g4سSvȴ^'-{4{P{S2Jkw~b(@+%}s`xv5⷏u1O#:yrSƘR$U9dKAG%e9w-E;hnfaɻ,?m}D%IRA's6֦~:Jؖ UtHFQ)^F2EB$ރ䜸+ӹ٠/EOYdM/sONwfze!Xո;ڶm⺯e`;;o.c5wKou P's^} UxǼç+ +uSdI `H٠VF]6!h[r>1RnHCY]ɹ$*sz]Go\.!)R :VZ4 ;֞8Ÿ7Ky_L3/ԃ/|V_UG:Z-͞Ng*VO]οN'߸.XB9zBP,׬$a-L}'tzlubVBnS[TȮp. ؂T:c7ay \nzmhaC{o}G xr\"ƦTX *ƭ9 XmCb3CY 595ZEʣđj'IQe,A5f<{P_k{\fqWB@+`d9P) kPL:6͌L<¸KevDx8i͚E/.+^&ej{q_1eg{Rl b;p(xY;iRvb'$'ݞMEtHE͉kTo}6Gnpvu3k{^Nlðy5e-ݴWzh|s[[r3?L -nN$r1mx~Џw~CQ3Sf!9k.hm]ChNNďln@iBt}x"e >6'(O&Ïs"Nyu+jӣY%;1-D#F0FM9䃂 jNE!AeZrv[uC Tf59>K63zWBBmA%< UȹY9+@vloGb%Y>h>`UUn_rvUj]J}ήo0gRJatyDN'Gxgo)6T4#@8"2hFʹx=Njv%wE;x)hG5N7BS\uH*KHgE1slnsV*`4.#T] TɻȓJ! =yx9 49[T]Bw,|dcx+ՠY͏>6ꮷݧ=uͨ{Հ^6bzxDI+ VY6%O:.LdtʆMנ'Wp|Jؕ*'#He ]RQС )&!tʅ&dtxƒF fQ Gg[DjbWtaG"e`FȂ]g銜Z+8X;o̙'[AnޛzIR~M1}tu|wotc2>Ej&_Xw`h]hU!!u$@Ck]4҉m T^:C^:^:]jedlZ &KOL!iKyR E؉YлdGUlmiSOR/S]l*~l}W[5>zLLYQJ%ʽ ^arATjQzJr)~^vW 5K`K:vK V 9+70zw8x[ԇ|}y6XrxJ贗q1XSPDiDB8s⁢zfvTL5 uA5/jRRWS~U)p ''V݂s\wkϣfa4+rG5,3͸z:.*! 
!Z^߮깥˜m&D©$* VoHR;G[?l_հ/[ҮoJgi++&8*hE]UjuuU4moR]YAE* R_9s-ez?Vc{81=Rӂu3 LB#, ZæI¥ AN3Z.\gO~X$޸GO֊UŊ?4Ɍ(}A\+o ٻoUwk1jM/qRRijR½n˵tB}TZ{GV#3PiE^i !my(+YYqGv;H@CqGCSkb 2 ХbQE<]2I05WL@Ci@xr.Gy/gźg0zE9&&]YBC4(H b ~:#PCw(Iݣ_% c{vvz^as6.g sJ~y}GuSh’gN]+0X 8W;3/J=qc\\R`B'$,kl!KX:#gK9kW f-mbT~$JNeP"5&A2$WAZWѢaL^؄"^?W['RZe1L.EB:\.vI'T@劍Ѡ1cojufj'8\x&p4<}yGJ@t.TvDM,)"[VoK^qt8dWvw,+D| |6gC֪ >%S-/7IZMYl(U!C/myDI>֒_]X‡ "[FBؤ<"D@%校PrV6ZI 0hTKţ;87Ktt!&ߔlZ7ח&Mn]w}~\ukHԋ׋ol+7Wq4Boѭǫ{C=*4ywDA@ ol6$kXhrNW{Жy.vSgVJr8wެ7.bv-JH{DT&PXJ_=@2KM`hG B q*InQ*8Q$Id.R L*gؘ 8(ɉD]#ٲN9`k^J7[9YG )iP %6k E.`3iɜ0$ A$x?hٴPA/ H"%K$ck7ST#5%"{`Hߜm'< U3QkN'^XWukĘQ)&V&:PSRѠu,BJ$_ X[V=Pԭ ;\;R[{^gy0/y\ch]lxf4\MY="`Kd$XFU'>f:}UI+,%Wɛ%)Lfp9=WV,x@ <8dv01ѸQuN>^,b|cU]W.}y7a ̏'vYNkys϶k /?"ޕO3D%&"ЕRl1h*Y ZQ<(t6Fkw/ky@-dMJ(pl@ eV'19 x<, `V? ,Z}`@;w֜(}98e>mwVypPOs& ]GB0Ӂ\ιL޵m#_֎.U $14v6I8~J!2!l-d-Y%\X0gīn~: =y>(U)USh/ EDTB*iPIC u2r y*_4r3 C6rb8Yv2J(9'pZP&9ؘ8ژHaTK쳾٬bh&b>;KG PСF)j] 䂵R xUhK솾woIG}DV9Z+\%_]Aws6;̮egTOx0^ r3hOV Fõs@OIT V$^We*uxx?dˤMU5Mn&nIքc*x%t6r2BiI R񊩡q"e匲@dEk PARH&ݗEsqnPF9LrB)RxUD ujJUI V[ L^[KJRs%qwFJCNJ6͚s5Moo&sfB۫=fHƢ,DIP\KEdT6Ƈw 2">VeRխx"Ld (g%)uhyC&e0,B&J*Ut W%sN9[B(^ rWϫ}}$pAS.$&!%39q5KbvdzA&M>sIFg~'k?L8w-lAz{fPNTI3ww57h}#ւ,uk ^d.+'06{w.I?;Jr]C -]V z`NZ$lUCU@4TSSL$$ݧt{_K̑5fA8j6OrŤ۲ʏ&IZz *O/\. )/x } _~7~Owdնod}~뮇.*.~8a3a@ml]-\jc}B.RRim<bLN-KR}p 2}5 8 2 ?D)HͮsveWMxV,zI#' s*e'Ht) iγȳvvn^AnN~,Uhf}xXު6Y VNNDdA`2χɫ E Z8aRMP)݇wO_=k\}Cî=_N&wgvu"OFk8\/i[khmti:JJrpBm?a~APR%!dT5DZe9`{g3=u.9 儋NScY"R$J ^_L2Sߪ^Вaa&'<}15gӚ!Ef 5dc!+R&h-ZKSK ׃^oxRO;[Aq2D7Vi΁4SV^#bG}#&'J Reox X,P:m1R%E%! 
Xm5#lc㌠/J,Rq&HɆ'bvDI+nG^j͚s2Ցqq<^gQZ\4qt\츸H,6J^AQaGtpE)]FmT`[ *(;.ǂFk6C:<<0\.ȃgq >j'e"iYq={DZ༾˃`LA.dsz C+3z.䛄7{ oT&4D3/"Cy´(Q%43k:OwmS% }I* HS$%Hr8朣bOYc.]50%|Bc|e G+üjb;U≴6/8@}teTc0|=dgC1fZTThNutCڇh6$<):$m%/:KBT͒#$SJ1@,P6a 땕^ {-<~z);4S c=e㯿gRSRa2 2$?ɐ jgU,GG/ՈZTDe$hfM&D,7/I%T:4G{>mϓMwQg%e@իUU;~lަnk^HUZ_ Grݻ\zҋ篥eswNg*}(?Z\K\rjuo5?>\J5o?nnԷ~wL3d̿bՓ鹪@)o7a)ONJah]۫];=yn$,"(6,}l `B8u?{Ez@-N3NhTŖؽ#] UBAFWF&ONUv+QdH0RĄ7B*d hG,`\t1Pt^r ި*tI."|ؽYs^@͏!~g+ۻW{vfSOiW2<=#~>(ҹSI!,}<{A>wG"12WDydb2ǐ#Z\plB&^#p 6X3H!$<ԣ !aD!$#P mMP"VZiF([|F਑'bx"bs Em%} 'Ydv$'ɔ4c"j&@0"'\׎[50](+npyn?pLriQg}=ilguۂ4)yU\WOWNqp\8xwpVJ:8)v-r# ( IYo/QN86IK^ɓ!br ^@Ys[Z܌dq{]-%SdLݒ˔vKn-%SdLݒ[2uKݒ[2uKn[2uKn-:.v\Lݒ[2uKwdH4ݰi|ã61Ln&W{?8pِX2s!K(C臢.0Ul7x8qX8>qH8<_]|.pXٽ\/>˿%5 !s$eH)5R xu R9]D(:*5kP{.i'E%w'|Kjq=d>^y/6LcU ]0_^̹\]+yMHyAx }Ϳ_7쮶150G/gf.LǺ ooפ's<)֏Ͽ f'7v|Tq9TQmy+|z'zѨ^Sk}y W y>H^Q c˅/bZǁ3ʠ$RӻE:{΁eCIIbr^,dElA}}21#LMYb'zmގa^ =dr7:ֵn2-mgZ'bZ)g F6W( j=vJ zS v̓.0`]` v1נ$ - wL?e׃[50?{B|d#}a#ƥ޳-q#+n47W!dORԩ$/R*&ʎoIQHqb@@ntiYy_ %T[  A ^y&G /yVd:+ZGMZ۵}L'1"oլDqi<.1=/RC O\jޓ\3@b"X r) Rb"Z)Pc"0&R@dDҨ~!}[LFB" +N eMHv= }aGaQ92gy*\ Mmc%R`&]mpcmUOmK},XFw֫ƦcK.VxOKeR `+<}Q Saaz G^Q;X%L<iM(yfPeZo(Wlm;ta+t_ujL3gQ[1{ԭ>|嬀 Z]ҜfEhe>!0oEnn_FW׷LZ l&\;|}{c'o̼6r?>L'=-O[?&y2qtd\OpMmM2-bW6iiL9)sTo?Krmg%] ]Lk>Ƕ= B#de0ʂQ Yvmv?Rݔf)y=ɂ]srTRqTO(VnQ)& hSXDmBd8HD3ef;;@}].?? 
;i$O/Ï-7lj'ݮn;M5>].cl?Йf{@kYst^,$7Da B1&q/#S 'b u%$gKvMZȠ#ʁuZ3pnzW@|;,ճK_rnSMvEoY>eI9/(Z/'+Ah9<}+̃VNStIJXͱ..I_ZU@Y9X 4B  ƏĎT8"!Té{B)W9{CvA`Fg8DhJβ3֦up @i2Z#rR^$D>OhHH&=:EеE8[5Fe΄0T$S 0ͼ% jvE^ĞxqytX蠶&q1['#DDoCRTi̕xVjnENd})3ʭH:_tv@"Jst*Ý6X i$ѩJʡY>Gs}3:s,~ dmN7XR5@< imP!@!.Q'YgHg ]} ȬA>DOAK@HTx0𜕴AJ ZŅ6s u28:f"0z>˨ό9GgԚe^3aF-& ڄsՙfN%>~Lq8ӻx2q,F)Lf$&JL( ) oEU8:au]]K[pm붭4cWo׀ӖN0nH[ŨsqIQdN&L_%'S[wjrǥ> EK|1-2 Nֈe@c+1Z4)%3(rT1ɣ;X*c @aȱQǫr^E7}_T͐EwǞA ̠k.HHs'7N kX%5MO7(%r.*0 a+ %}Jg=bfzŝ;^Ǯ)cǦJB] EǫHII%dłTe€ -ehRxRVF-ԭv|ҀԚ[ E76w/LHI-Ѳ]k]s%^m@r~lHa$ d>FZťA*IJ ZJ"E9I o9RIh5\0Kk uqc9L@ZKjS GXM(QDrDI Y(@rQwWzlrJ4On^O'QOz3mSLFPh$!kM&r<&FIuhDffbo0GgA8T 9JH FeA rT2`ex=!uKH}v7| QLR_knO6jޑ5Wy &Q)i"N2kn}`ޅeP!҅7S=MI{DY <1Y_u5'ܐ1PS,.\s"NI=#&$l>[|cI rX:L\+ dΚEi<ջ2q$MnlxypFҞwNͪiQKڇ(NAh7D0&esEDHTsJ=H?6ΟZv/f2bLIã)I>>/^90ۜؽ*.6k]{/6֥@ZHYXc2 e꫚W)?ɼJ&k'1"c>p<< Bǿog?߾?Ͽtq0go췟'e] X`0-îK ,u{;oՖw{|460,ekC@Bzð" DJi%5 S qůD S4P:TFa!t)?h]B|#rMlV?< guS$g:vI 3\Dw7Єnٮ_K#WҟI5M&,fedtO]}drwr~FSQj3fYz #vG ,JVH ]DR/}o=wd2MYyc#0ɔبtRP: d#i(P*_3JzU6]}[i^P-2Xjx1 VZd4i8ɟKܘ`K9!˪[dn$.$)z2JM`Q5ǕHH֙y|οyP>НIC}l1i\@˱M/GA2 }_49aBf~mWGP5Z㴫-7( E&Иo%&W/B+qWꏟaīT10[y d<,C2([;Yd{z *,VVK1dN>%K~ܗvoVxfV !|Ϊʺ+`ֺ HVt.e"q_Lɉ p 4+YnE[ \t/vK?YA\[H^;iXt\@ƽS>f9nƤ@*3K:`-5fBXE SA<)5J`py=brcwg&8} `qG ǪqJ>˺dךenӽ1Sw tx`y0V< 71HT0ٓ|hh dR #U)2! 
~)R_ȏN: ϐW"\r6b{Rb/Y4] $ Y]_zY^FC` /gq \6m/K}OhӹA2^rLf`s&+I0,U5P+SIkAY#< G(||zz2K?JeZrK^pQ$̝Ąd%M?#;;\I?6|"zZ , Ąy'km+7E藶ކ m 4[KW[-lx{Y%_i8:Au83p8| Nl_|{{ 6(vrʞKhBt(UTӬbbB*9rZ6|^Wg`,]G͍GRbzBI qamUoRWY[^[(s7 O_)kԘM S_FߨTaTᑳ:UrAHńJiⱺV)NJ.@Z:-v X^gP yhb~r zM,|-xO@hkD %BNl8kunPp%|^@{# .[)$'!%38C\ev"ZҝAA^R^Aמ3WIڀV6p+y:,lKӹNSCt4IgJeywn!(=q>${<Z׸u5ۆ%]i7e?_Z-vGifCTF_dt{?Y"ht<3;߽_O/-iUߴzu=l ;`Qպ[3H_繭Y ?;Ԛnuz%HŘjOL*3oK{*Z?psZzOQ"e5%; )J4IcD؍%%!vs?֥Z`J "AB% \$*+rymf l4dRc2 e) h1+`VAyc%(}VƯ6OIqaɷײУ; UDBL 8/C2b62d4S 2m=$ogz:C06+CɺXr(>šgNs įT(iEa|9)E!ۀlĈEy9[2&d)8Գ^ ْZ9zsT5VgrQka09!ylGP^PBĠ-Us~_{W$$dm-SV/112tqBԮPkiBP8@fP+ aIOa HVAg(knBG_O_TFdH.Vkp ٜx$mp{sߍN_r=t(ShC<<ʹOʺd*O ҇S;a.rWAO|ͦc1 XPC=W#j*I*1E%G2hUF8^B h+0œӯ}юf3oڱSAЋiu/o^t&+Ro HĹeB:qpq\R~m`eBꔔ @ȻbeH@i&3&\Mxp/nc74#6FaXVd.,0H#`N:-:mIAyjuO^]YUĺ2?%,N1i͔ ^%͎yE ]7RLqsҕ"љR3\KA(I A k9΃.R|%e$?E ,60=13!'SaK3qSr<,+Y%> g@gQXʛuR\:伊`>FtGeN̐,FGo<2}~|zZPCMZq߸5{R6Q/W g/l(atG5ȿu.o[g1JcxRrlE<&\Γ;J-=۾ޓқ{: nik7=+bʓX=r}ݜ+{]^+ҸuZ祎Ԏ5Oz)ۯ2}؃]ۖw{ -oX!/wǕo)F\$u<.Ag?ukx\AW5ד4J˯ج(..1oץ"c./ !?ݸl•L[HO!?'\lNu'֧3h5˝%m6TT@ M3e%HQ8:`|rrJE72cU\`+2 7UTԮY!)R2 ,s Ҵ,Ơ( ':ac"^c}~47&TM^+7Hc :˛^_ŀC[ieKgwP7}>AU.Z #aS  wZ~rk\VFDUD勲.xE x$ֳ["-i:gxvү,GsO>o>{.԰ԏq}?xzlw{*qM_V^ײz{eo>=%//oD- l"ЕRTH F,d_I0Zɯerb\m`v0(Gl6Vg#($(cr x<,$b4 M6"&!>Y# ٲQkoGyH>?_nf[~~nl]DVBBN+p&(s. Q dh &vrmexN>|ka)ψN8iFHSXFA%ɃYIebiby*\ktp쿲xZ/cDaP(9'Ef\xgoC⵽FFY ʹlIQ^[tp`.Zna/0U6W)O7aXΠ缼"lUMcJcMh:DG--Ǣ%{׶Fd.SK@?a~ yI I[Y$%JIJږYRd91%I'dRN1Sii:Rn5kl9z k/.r0)>'^Bi|S",W"k !T $[S}^oC{}LRXё`qbRHNmR2=V9 |4ee?omTDlT3}G_9*P͜>7%4AqMzcy[Zn|eң/[TM_NznU6x{Gg {YŃVʳix^Ȭ]&+>&J}>P[6mJ{xȤBd.BtNzWU?%ZZ!:2TO,K1ΙI"(EhcggqoZ_'@w w}yY"-k,ɓmݶޙ\=[Yv(؏[N;/ ٕ`V췚Qs`?~f["mD zX3)]融)6t"=|bKwk]:pұmO C Рe?".wy7lT'̮eQKALgM"y)Azu@L@L%- 3ŵUfϥ`b`* &$HBVƚDA|;A\aq:%ja6 2x1Z#s;+-5=W T3g!M SzkއQ gi, 1 Y9_ۯ`P|Pm9-'lCYSLQ <.ͅD:Ly1ǟe&ͱtKӤH;V']v&ߛ5 ÷(Y7ܪI<. 
៯wfqUTޮ@a![jqR[{4().(h.EqY8qL&QWIcӒ Y0Г F꘭ѥhrSTL ǘj#c5sv53BQ E'/>P6dxk Ӱ8{ro n8o7*5ZhL(25+B@jcVThdI)K"0"vC!{.r*J^AFQc3RAʈ]͜ǣB1ۂfǡ Qz=؍j:k\'JBD i!d-z8!'ZTo$dH&&O#b~P4EeIuUjlÆ/ `l "V"G7x 1‰|*9'q'1C7 RZhTUDDOր8Ifx.8J b2! h B9TG\.K.)Ym2.{\X`gg%ipF;90$ivڒ, *d=.ۂfǡx+> Oa[k#7{h`'^wN(Y`ZeGMAV&t BHH8zM:dtbyTQ 6.wj!]֥H3"Gj30!+8 ɚ1&dV\l5s;QM|&جmƆ/Jbo6 O;'GgR~y]>R&t+jUmW@s9"@CBP@PcX}8~hReiT^if-$8%HJ E`)lТgf':F}(@5mѡ4ڊCiRH`pQ(0RdFJAK( o!S Ge Yr V! Bb6Dnc3%e}mz(?Нl-~?i=WDTK!j(K1Jx$<,лUν-r/y+ˢDέL,8@Lѡ &Qdmܳ olwsc7uOO~yr07&Eԥ uі;TjVّO n2-w@qp70f7% )K=ԍ/=+~kzLPwMF33&e9ڞp}(A($&eE.ZT]!{&18g-G}6Ӛspr}gu\{^ uv}qs|S̩z^30K@%hw&d>Rڕ.Rֲp5 ԥ ՜=^{5\=Ҟ/aPLԍ{8(^g4?p80($@e}%&+Bg\V+5;dr9| N gѝbLZ+Gi2U :W\]bS莇bKW]+K9t8*!:L˴ċEFEUт(ώV MlE O\q6ļ*14@/]X +W 'MhB?j"/腢҂.yPjp>F(+G/rLZ"qQΗpe)ScEf,:vTȟfs> :aFןz U{8q~9kc::U[{KI+d&2g 5Rf 1TmK̘uBD =8b0^w=ךjE&@AZ Tr"jêvĔMГ·vxb'd9Nz}13qE:홁k30'Hu`QkYgOιJ~:ikVP>4JpqdSHD΁FrWO;@֝Z? a*N8mt}ܪ6 WGWmҾ{YOBnѣ2^O.AI'_e4,->5l6;\{ gIQ_z|*S(c.TlS._\Y$f $'ӮZejE! 3-^TT2 q7vEeէug?.DN6`<DED݈#"x_bJԈH!WXSr9hDZhRz+"xDWVGFm` *V@!gQFv>uFnُϫ|$8uJsݼ乸qG\qqǫregdxt l- P]$-Y3f̜X;1pTw<}g<#@}f[ټ~]{g2r`d>'X#_X<||{2g+d@ޥQ,H&QZ9ˏy$Zš (;J +TvVAcoO.ќP94V|Dt&dEʉ$YFepULV r1%SMcc4Sg<_˯K]6w]{L/GMٵ$ֻ:5"t%nO6o} Zwwy5cnqyA j~pO7嵖a:찷twݐ{>J yK{ TN]P-e皫_5mAn6+mm^SVOv 6j7S>~ĮA>dz%™WWvĆ  u=}v?QڎQ%MhPNv56jԮ8(be-`CVՁJMlLK'h:YXڽ[wSJyN]gӋe'~lvzLsҷ]5sY+!Z$r3E"ײ٪ķ)6D-*jWnJtⶡkgЪ> Lh1T-P)RFC6k2$AYo֝g}8n(ȇm|qrܬ)v]1*7RM:) ~ͧg&2wO5A`!@$$Qg%$$''!$$tFYɨiiqPϱj., j H}GN&98aqZ].ǬX#c]I $pDaQn|_i.q(3Ϳ^<^W]xp,mf^ƶͥv7}L` 6_E>߰c`KcKX7,ޯkڻx !] 
W.% )(xo¬z/ڿ, /rxs&*VCD!M3+2)Xl'G-Q5i#WiS-} \6*kKT"/?ƣWM5_N>Ev0ϼ> rGL~;?)ԽE-/lGFJ-Z֋rR0!0߁PШPZ`$kQAŽjQAТ:ERЭʲM"R|E Ʌ 1CNzd]X Og5Vclh ͷ +KJ9ŭbŌxٵG_wA>+;UjB>MW52ϗ̳ךgS=Ϧ^m(۬ $ $)!VFP}rpYUu+b.fK;I:q5(}HH{jproխ;`tA͖W7ê“='{rn3{$Qf3VH 7Vޖ8*K.E˻Y/r2z=W##< v7ooD#Lqw4;w7-)ЃexJC+~>ҒG=n6'7jӷň}Ue4*iۃXds${zG&-Rضjg"n \Ue\MԾN*?U|\^nz8j{vyyvO7Z76}/%gG^DNA$hce"3!ZAI86(eoGD8 "4_^ #002+CNGa:XKXc\LTe8e" }GAWz3&(:mi* e$ "dq:: b)ޢZ.W0`ΨsAَ3g2|ly> %*Z_9evSa/,v$m-TMV<]/^O!hp1rQQHZ r,sb`xzMh{kllƨco bK?zFuHQoewA;:NvK=@&q#ƃ&aPJ Z SeLE[,!(-`(8z5)!)Sjw=]|/[+Lu$x2kYg|2mY1֏zMp1]aAy*wi4:??gZT?B:-!`>Ebʹ~ܤm *L./q&vqu5!kCk/k9y_6j<6iy.wn ^mͮ?1:úl3&<Cy"sx!g -fŸ4Ԓ"_--9N4opYgO[隷N ,÷}:%K]ǭ;w;]91Z ܥ剄c<-<{cҦKmݹǦ;/g#LXS=d8i=%w[Iv<$3-Te&F&UiTϺ"2qթVSr8kA=NOtX: ~tNkJ9~1}2 ֨AXyy[ĴDʘ9+  Jb%rB#恙?:vB67ƪgcgyhZ~I8D(?o+6bpFۈ%K2ܕao!Y&RTj~cnU7Ԫ;ʞPuJ۩:e^鿹<ڲY$4SJ("Uɜ2`³ jۉB< VIFdTrNʓ=`rȄZ[#@o*D-ABo5O} l' COHy+x>][֒y1e5,?muMsv\bqfWsr6(:dΌC՘ׅcM\1Ċj?:*~(QWAS#[fN(V1,AN1wUR,XQiqx(`QYٰ DU)q֭;"~|0[pԖ tH(\BS"j. SrdV%B1=C߽@}N3+^o~[MXfqTrBsV7痣Nu}@XYPY4k%Ä)HȿPՠsPW2z p^7xҞR}z9π5ɃF:JxnC4yf.Xv&꼠ψH߾MrI䋞̺z/ynu^WWuCY%cj(iTQdBQ<,T!;G'cC\ߞJttSj۬ 9]-S6$ME,O@_.ݚ -d0&Q grүZV=N (EkL+m+GʃtclˀI2v4:i H`(Bd#ɝa-Kϱ+?8X$W w ϻ{f3]?5>}E \YTLVR' l aU7jBV> )FL쟳2Br8"<Ǯ'SҬ]>2A2w$7]RtqsDwr\{uOi6|/Jg}$V=5^DK5/*drُK=EENjK䜢 _4W+S0ku<]/ˌg\λɅhiL3gqJwնs Ej_. N^ެq;sվYK:[TX\ W4 FQ Y|ӆ^9\\`:RՕk;&kJs8# X>}1$UY?^DZG磚t:*vOIo_y|?ǯh #M Dpjm~[O/Kƴ}mMM)ڴYI/Q.gmVyAa򪾍Cۉ@R~~^zBq9mv٬I^q]{JW1cj-8D#nJe,5bɆXh$_GnM<`/wF<6[1~Լw0H5#ʒD WG IAO˰s7,LkY" *d 331YdI*,0Vѣl &F+H*[Q[[u%po>J͉z[:Zd^CP[ɢb^جCYڝiӒ ";gO8[V~/d E%joc. r\R*rHs.dd%EfAGI8g9!Z ̂V"Y1 kIӺmlHa/=*^mSR4-OxŸ) \)nCہ+d}&sW;~, FuâtxD]:8.gJ O'^,s &ڻub.!R]C}*nol*#l )v! 8P}opdJ,hoUs!G{Vm9 (1-R4UٛȢAqP9RS#V[&| ^,ͳF#}}p=4kW{ƀ9 }ӡƏ<]^%ej #58i櫳8w$r!2`- )AGD=WND40:_!Vz^^귷vjՈ4︀dqazQ̫js~N:qz$m[ M? ;qрnǽU%PshEn}Z[Pa=: -a>Ӛ8KL7@ !N]loSDXVcO3G R#&hx{K>Kӯ<997`Ɖ̏kETْ /%B4HJZ \10skK* MBq:=߰_nJ҈dŤVII#Ī)nɱlxOp'5KmON=~NUq+0T~Av̑R;W'XBRQ ŔO&)\=8N+P )U&䂠V]c974sg5qkVi [ӌUPrO3JF2>[Ͳ~f^ ݍ?~nK3}M^! 
mdMA6yfX%.rY)( @VLL9A I0&j.Krj8s̢-#vk<ءKs.榠vkڱ*jC˨ ;Yr Ί`F)2$'F*#WQ!͈Yicyd2DCXE-PبpĒun]X$-akZmd%ܢә!U.$Rj^;&][o9+B^M sdȒW-;_-ɒmm[N:HDMlVbŇt%CzyScv~Mk xw>N‡4??MJ\SGFIp,b{M3 ǫτW bVd܈%M+e5j0XU1B|坧ӫ<T;! &PZQ`'(4tuCC d{ޗ퀭JS-Ywub[VIzW >zLy||g]qJ6HV.y,5=?8x / e VI0y&)p@A %ό^fU6rI>}׉bX>}&3hhf!؅w'][YpW].T?ؠ?!n_ZWt{^n~l6eE-v'mzP^5Zn_d 1wJ|C<!]:-'=6&F垥v+z3YjSl9^Fϟ%Kc3Rc_/_ 4ׯHt{~+-jV#7NFR 0=CvU-r65MYssPqO0qgI-ģl `2NgP5j5iʣfO"y?x*ˋQsצ W] y:9L[U0^qfak |>5yeȧic|ppN@bs_E|xl;pn%!Y.d[eM(OH0A̘r"j+J!$gD&2q.3۱s3vvtnM t"tx`V7{f覝b.gg6kS~ &tfP"p11<o=z/KD Q+P%b21^jE+iٚN%8nޔ&q:g-Y}6LNk *uIع_7o@+XV^|m6nWgr5ܲ7-8VY&,@6cUʠK#ԖlCiQ1~;;,*ёE!Wi[S-ބPEUd߅:*tDhJ3֦upۮz &c렌ֈD,dIk˄ 4 iR> X:cgG9k7 fBn[)diI /6HDP&rNzaNGLY/]!P;g^9GYԚ9eRgZ"(eLz64hbޣ!P3fNk}|'p<ӻx2i,F)LfMC&  S, oJ^qt848=k2m}MGlӖpZX`Vw1\hehiIQdN&L?%'K[wr诫r>l+|б-wNֈe@c+1Z4VT̒G?`*E^ѝx\+| C zY5Q =jR5M|֐He@Bl\&kxAa)WRJnK.K}@ -cSKkoGrsJdT(<7Jޗ8rr7TJؤDǽ.S)s Y vFBi kr6F(dZGQh uLP>i@j͋kE&76w/LHQ[ev&tΎî!w49=ny-7(K|ʿ- <8Ǩ64&1/\ޅQ…4QrY>i\]Mn_g&[dqőGܐ1P|,.]S"N~.\HD3zFê\⣊ 4Q;ZQKwH0&43Bd)1-Z˩q>Klup%f}hxz6[aWs(6rTZ{ד;JXKO6=y[7 n;ĘƓFG Y,;d$L4"٠-Y?uYIs:g̥+w^8,8bd(UآW""Qޑ~>[Q}7<(9L LJ,M>g$5e!T'Pu5\(Ňt ݀g tlK9iu5.,Q$+ÔAԐD@q!l2{=yr%jEJHzG{~fFhLѶXՙ%x0 q䖎< Xnւ Jxpy=br.;$҉7Rv|RYg|U1NIuYZ9ߪAtJٝOc_pV1J Z # 2yLrh4TwZ o'»@oHhTK&zeQV{f#nǷ}L (} P;U"3"WXEf#9Kcf\i"jޒyas5, ɔ&J%OL;/g`< C. 
x"Q?kOhpʱcCtւ#O3,i&7o⁢L 4wmz^@NNeZj7ii3 LK)[ ;U.]G= 6靵<*Sr$d=nfa`F; t1H(hEn(J胴BLNy4.j``S,B 8ReNm;+q[o3g RNmh%A=/TWvͺN.^9rV`zMqIG*=U4$c3LseRFI#SEՉ@QidISmMr;,% MdsRE\ΠE%<u/1&um^$gIC(2r^KRk>!RKpn&u2\ pjpe|qڇp`^j?* \= pХh*E{+ \ejAv2pup@Nu>> 2׮2F pupA3=+$2WZy 2WYWѦGpsLK GV]A:DP=+$X]eruoڑZH*Sa`pu8pX2 {W\c0Spj+E 0i%-Gj)1dqS,pgq\1t-_*RjaʓY!n L3D<& F)B*S:L`s*þjÝi7d::ǟFN G FsxԩDN]8=J c(!;JL"%ܕ`, e;m ǭ>6 jMPz]Ksr+^d;x4^{׷*l +)cW忧AI6)e G,,A7ZV񐂃b4ٕd g%]H{~=6b_-ng&|N*P5sU/D&Π`AO !ib**U -ltr9^.f+1wa`T_u6OM>**.WltSQمcͤu(h0X#U41Ԝk2suE;@lZ/i7^]KJlHc\8Q#Xj bcQ(V&0Kآ5t݅^q}h`? ʞ~*O3݁/t}5Dsjo&Wrrlfy@/7l}˲1+5sev{u'8zNt1<-$,9_,_l-2UBDU}_"!QftstFS&S2mS4liR|YW%^EL@5zNuN.;{sY=5H#O;`7ixDY[ۖmiy7ˌ-#eEY غY#=V”MގYL |w3Ȩ\ y £]EG7 ] yk*#&}5IEMZgڤSofǪdgsSa&N Nu|I зAo>߬޽8_r7uʜno=jmBXq#>{VcQmî֢"nW61x`yӚa߬d\CłsYw͛ߖsO^6MvPM`􎱟h&w3 #jݗ- utO|ԆE~;N J 0$kee d+Թ >xRd57vaΎ'L,Y~0`82:0U;+Ed . ; ֡sN8LJA̜Ru~ ͚)BHTlӪS/$j]3QEv/8C) z҅+Drrd#9 j:+H7q~VA+ !0`*%d-ʡ y/62f%NH C)g C\BYFggoV;UZu:sN1"i49 LU8V(ϸ5.:v#<!|T2Nv{&?ܭ{3X~yg&ܕsɚxsXDtFF8`zxJɥxWMTN0Ȕ(TŜ:WfA>M1 z% }Ԛ!44Z!RJB25Qq5Z_\ (I0 1 J9źҲP(:&X36QI`snh|N$o.kr.ƍS>\;2Ss.v\!#6Ԥ,d֜QPBvdS.s`U(fD/,'y]}+*sET-j#PׄsX+Pkor#2v*ݰJ7[mek mg[h'[xQ[Mvo -%=TY y6·Cc€ޔ0fbB {p xMNBV 4-nrd `r$2{pJ:"\t' 'o2-:bUN4Q\b[bJCkpM(rܛ(Xǜ"uE8N{54QF\Ơ抅\gL#Xǻ-+_#_A=X8q*xv7wt乔Ac8 `bdfW'^0eo=y{xɲ:O¸JE`IZU< hسBvEsNL6ªށpdi)H cV;Z8QS+43s2V.y{u~A_+^mMp>W>^ }x|~!SU| JF ,meVeo}`'sDu5DuFFF|en969)1R x}}2d֍&,/ʽ]ܗC7x-.xF3!e}f|LZu)xEE`i}7i#|2o>U~jn~^zժ*_hDN0!@"Mlw-6.7%K6ܺ䝉Pk!m *?-BH hqVAT1Q5gdDQgh{NUk(,I;vao\YC Ӄ?fj' ㉇q~Id?M.:BZU#J]UD ۘYĢ]QdƮLb*QI^̄)eHcNH&֒u86RURuY7^0^~Ms)~^110ˤZrmy\-܇DSբK O!^pr=D!/D!aҎBˆ($#(|6*5*LL5KL ʪ,ZDܷE$8L ӂX;J YCc P5!$7Q?2p9!q. rYk|nđ)|ecKV,90{h#1SwziqVV-ng, m 8x@Ňs*<@Ryk[5`-ݷ갹qY;: {T378̜fP똌l1&JJ٨H@nJa: B,rf}Y8mO j6: _? 
6`1˪S:baxQ`_DC &PPْ1)D}5h XYQ)A4I 8Q$e#::vPXht͠3N&q%f@i7m:BT7Td;Hoix|ժ[ST{ns5E=LJ\;w .p\N*V ~ܐ>Bd1^)=nƦw`Gt`;1Rg=ފH$aΖbƨ v[ -βxZeHSL1]}v|EElD1})>A?{Wƍ _vnGF\R]I[ %;ίᐢ^F&4c+x~ t7b8I#c=seZrkq:EyˎhyTq,>~Hw$G][CLzy{C("hAG0"8nH2皮jʐS(ΐyҨinu7\Zlm5.:Էg$.򥭤 s6gc.3XIy׆"9r@61cEY,D6VNtgRjPZPw *~ $NEPYF(E+b'A2$WAXWr9Hl,QTO}oz ~f/%iI7@&cPYѾZPPE. \bc4ؘJ AhjufjO>6t6ܹqw%pt. *; p,LfY-o+#qtB|~qڗ;{^4X 3pVghwz,3q֪ >%ShL$Yl +CQۺAlg`E_3o&zz#!F_lRsYbLjF̡i.i4X bTcmA;:i^m^,ha>RՏqikPJmHP(* a 6y`6>NyJyBkx߫J{X}~ظUt*E泗WQ\P M*[n+)XrT,.,c\g5^N ^lrŋfQi :Kq\r)֠[N%T1uuBY%! #uQo B8rKk'+8[&]ٴO'B|on/[JuH:p4(.}6k 7*\fENcJH&scX'][W5. f"7:sc(QKw&b,H/uREg A!kH_Ϙ](`gyiBGeۈ?3.3"< RLH&:ȬIBíҊ˜+H72,TK6dfhfe:qʽ&XT8!$+xKucE*Ep)䎂p59KViь9v:dtaJG\)6tn=.OdϿfiw`Z&bz[ %Na>?{4䳌L2ki!Rũoha1ϋ[ O-jx?݄m'EŏUmS/hGEgz< ?8eF ""/s$|qXGyt|d, U|;\'%d?ԙó$EyvNpiJՒ#ɕwj,gM^_5xiF k`NY9-V˱ AQrAD>;|sg=γ$c-=ɍ׾ޓTt on6*bʳyQ84g_Vsz2=D{%zm{WVu J+HM_,ʄ^L<4Ib +a:@ߑf,IH*3U.d]&JEHyŃ-(/xɹN";o$R>;*_xY5F>OCwntCWS%EPYm%8Si,|K56^QGA<~G<Q҈^Y(4.x\ =Zl<.d@8SX6\+EN<+Ӹ貔嚜݌~kˇO`S=]adt⨘hl*"9-W\ZÅt j93*o'Xo߂ǗP-)<]]&̭Vvvo:o7!ʙzGSꢔop"0 (T_ydK 83tO"=wŊ=gmR-&GYb7jJvVSmQ=m!AuRx@W6TX3 1\^̝ʁaNqө>~J?n txzz.驗A9O/Wb}c])*@˒TpˊgtFoňHWpy$!BRi2-A"<9 x<̄ `VvHd#mR~4@K>m؉Z4q-Ӫ`hW=E"ѬpsBz$89Ó%:F 1,g~v~F㣕zF֘[KVJ&)nURJ!dsBX<]SpLrB7RmcZ/CBsCҬ䜤 TW }{!OmrK3e(}WMp ckv2A@vd/uACח_|Ӹd\k/"ɉ))G@,gGm;Q{w5OWN`iOOYi`NyZ379Ԕ&OY]?>*V4]g'ɫ0}?Ӳ*zys{ݮ6m:p4TZC(m!25kη~6-Jv9ٗe O^tgs}mֶa;x[[V秧u_|֐놮$.?,MЫXwXuֽs b [ V;Î3g@ojxJl hd06uԿlą+a~ѩtcL:◦5f-e隟Cr}IBHռ gTWn{Zi -Fۄ35/~OmTdQĮ?8,,Ijrb6V7{Lw G8w jӍ:>6[|ͫ<ޛla۽GgS"e2@S_uM4Ϥ픷iRLnc厎V}aβˆO qD̝yys`3]XD)A;rԖ̭F}AP2lрbĂR&c:V_,~H{Cu K.S0u )JeDi^sѱD&It!=X^UZL6Yiw|$@ sUoBZ9J@}s?i̛n͛˛FZ罁rWbYnxhz(wE) nyL=9 BJՃ+Bkm 637ʲ!YW؈u|0tEhU+n409ҕ0;]!``j7"Nܼ~+81 B\$tE(j ]gJ9d]\3gT3t U+P *}+BGG2*4\5W侚ܨ67ͮ& EW dgoo>9/M889wTPq6a:~+h/nxy}-0c `y 2R,(rqkU3yYuz2sG;(oτ]^_̛,Ou`S%TMpMߨg;w/+ͳ$^`Ye:xYsEl@vvٕA0 wPn^ 7,AWW!tE(l9pXWޕ -ҕFmz \; r+BGgpXq˫x4|/cdQ+l0W$㋇e8Y1WJiIKFYØ$8 hj)ؕUArawx_3cCbwTuZWX%NWΧRQ%__RR,}x+`k@vhڣ@?@ADkV%K}F av0tEp;fj7BeFzq9$  >J9"f]֑Ow]\5Z;]ʾժIJJe]!`_va(tEh;]J9>r0 B 
Ǻ"ٻ"`ҍttS]"+PJw"zjG2)5/'y9G9551g%zs? 2~QsL:,ȋ(S 'OjV^\NP"S04ڋH,g\{H3W"@v%N 0Y Z)Q&)ي )jtqt4X3鷪N==5]97ֵZVMqA+R5iVi!NMzoe[zvgǗo^ b_o*]?W?z3~<F@N~wէw/K;~n?y>9rVNs?1>9*FӴoώ? ?OO>9Bϩ8e@W\XVQ˳r -kp̚•\"r0 Pkq%*[puι{vJZ5 D׭/_rp[=*l!ҞoĜtg4ff Wz +&JVijq%**W+.?W"8ap^(ZW\"G5Ol0^+Qq%*c\pu2i #0zbPԚҺW+Wj*  +)E-JTOhtA8]q"ׅQ8Q8/eq!j;J;; DnWP4U'QKW1FȥaEvژ,ꗃ+ЛS}}7EnPjO+Vi*yf2pe\=v"3]b < DW}Vœpq%*מӸppXe•64 DQp%j;D_!(+?e\RJ:DWVE@`W")E׮DܞʑV~b%~\\Jٯ]ʵ;T\d%ǿ7_^F7IYۗΎm!^}y q oxu+vu6YMλ)CwьHX}ton;DGn{vtOrgn2h)MrkUˆϫ!gؠ?{iEU(Td@W͟:G=aR)2>]8\]pX~Eы_KPzGG?ޢcm՝dhjʹ/s;϶ۣυ8 4Wa"J~^TūW󤭉Lo{C6o7! 1gwo#όFsY{&[viI-a|aEI-'oJ;3 DRTƸq& +:Fճ_T~9[?U>*? r\R_\MSK{rWT\Wz Wէ j sǕ zJWx\} ꜲX՘= )fBnV[rT֝q`%ӽ.ǞƎ¦&>ُ +ߺ1:R#XA;t*dMmN751]:*Fs-H4hFJ9EmT #,yjS,nmY>d9IMLjJ1'ZpOmKIb2c֎7f8:CZ䔔;y hB0h^ MVr%;`v^Ul4#tEt(41T ߦH.\rK>fgkZAQ! b FlT</՟ͫTڇMVTHhi/<`3<1'B&9F $W-wtM%%pN5bUTwrҊJ u})]Dk(56)hK si_#'֘0 ImDiivbTCJ PkUxKd6WѓvzU5b>#i^,\ O hՕkʧ(?SO}fnJ bfC)F EniW5RFvT%x!Ly;0͐V#_f G UMb5.rb- OY`boxi`Rs6;`N dr.j]vU' Tq GW&+˺SR&KYWot\s .s QxF@s8dIi1 k9`QmPuAP#{*IUDԓ.eX|.ߥ؎~5me ᪋<J͒Q@1l`WP&4$i(R,`@L)J9ijT2\Sr5M'$b d:zn vE?2U ++5q3Fn̷6( ibFBYp6{4vCEBy>uFݚ,FC7a?%`L biƗ%B`YLu>JATZ|mL;9Xi65lD.0͑6EUPf=k#%l~GBYGSmd^]UW.#{FtYfD5ڽSSRE>RJ/WbF\uO*DWE !0!bE@/ !vvߙx\ZȲIzu.O¯j{WՂY-@A-$ >:%:ppi376¥6#JdKW=$J2a1%OpHv9ok=/S(39՜`$\Pd"U̫ f&dmZCH1-dx $>yt_ 5Pbp\x ,nGfp60/EUwda\?V y E*Uh lZA]֊A(NEo#C&_]^\}2di*YrS u%V4XBh;K)G buԆHTʗPw|̡cL#tmG]K (v 54K s[O \R+y rN01^aB Iݩg broqGВXZl?tXҝ,Iʣj4 h@eVo Pjj5|VUrDo k NKXG@6F+ƁI![[+m#IЧ]S`A=XcmnWY\S$Em/od*$u$vndEdL0Zyvk7<'0:C? 
4۴p2k3JJ 2P7U).ff8"fg`بT8`ؾ-M=A2 Vr=1^EιgdUGN|3 cLu`Cwr߬S*]bpwI 0՗7*1;hUxO ZxqNae)7֋ʃXyiN aD`;}|a8֩ɡDSX(āX424q_4OeҝY|*T ZWL cau `!H` \, &* Xc"*/ղ!dDN[M3fUWM.nLTEm38K 9^A".IGq3& 8߷>ie|Waix>믿~?Z|(bxA) vI3 eJF9M&*N/?apx!Mw)fUͤCIg>=9aSNV8d.%Gu<V_̟>+N/5!v/y>!Աx>hr@ Zhqq= Л"ɒX"`>XXcJø @|N?Gw@@eZGwp}iY*Hfn;Ri_/DuހN` Dz TKiF@4z,pe1W0ֲ+%mֵ֚Oŝj_DL"|6UY`oq}vWZb1M)(xFI1z˭m~ ]d2ڗG~:l,_־3ShkӞd|,_H 'e#NP"g9E(rF3Q"g9E(rF3Q"g9E(rF3Q"g9E(rF3Q"g9E(rF3Q"g9E(rqEXַcih-Jl(_fn9jշ"%^ՇU W ~kݫdx$Aga:` =e30 n>}zU}HaYzs+-`:Ue(;|<*%C*P (T@ PB*P (T@ PB*P (T@ PB*P (T@ PB*P (T@ PB*P (T@*I K%P1?Z4zJ B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T=_%tx?J ;F dtJ P @g?*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@?z嵩TR&> ߿9zw?"p9YKz#\k=xX p雮/!• \s \k \0peoǸo:=1\\CivV yXpeoW]Ϩ1 `hoઘ{WZa!\=C zW`aW\WZuX)3.uI׽b}+eX3+ɔ}bW`UaWГGm{UV<\+5Gzp`IMoઘz k;t*V ]=GҜ ާP{1X]sUob9UcEzpU$6-b>~A)L>7cp:)c8CC-R{; s0@G_m/F0NV y>ƚ$5&;w5P3(߲%>2e~3/t ?-qW8^tܥg呕15<")K*B S0L?b.MtXC,VZyжEZR?O4˩i9I9UK(`5ha\:Ub_^&ţ Zs"\2{1s\se |ܔ"`z\#`-+A~0ʆ\9Aabn]١ׇWJ~r7zwL`o[9b}Yyh pLht BR:\+ CzpmFy?a}`gufcQɥ'IOAcz`$4:stR(Xc(>rlj306GQ&?/ /翷7eMsӏ6`2wZq[. PU92.Q*VU,e!ˑgy<*ݫ wtTeZ2L&u#,pgkcݭ+#ӛԋw xjChGܗ}߶dpܶ2+#T8ee^[ujδkIxێ[LS㖖'-ݮ<=7G]  ͍ܥf~7y[r}C~4W&oys {F{>쿃lht;Q)b 7hcwo{wubQ,E9 . 
}XΆoîY7ϰۦz*-yvto4}|mgsOLtcLOlN_w/yjE6|Z& [t5 7zHʳد{wkV=SéSJ7c`3_> ^ʹS^ 7䢞ԳZ2;#w?)+٤.Nz18#=9 2ƊL iL] -̫m罬^]:2¤fY 6Cl1U׃a}  ~VBك_./^aʕf SK=`=xoVf WV4FDz9'^|eL.ǤQ\c; eu4>{C(KJ$#NJi)gx=PKxK\KPKDKq*(8 TSn*W:>J$!*":rn9lUrw/wfCwӕ۪4\$Mv#^qtȽV7n+Kl2"ӑgbKFk)#|v"M%iJTA{"МdҁYm@"0DV&Z쁈lysRJ.HhZ:?g>xO$*ܺnӰپ]Z)ӗoy$y4|[FegUAX5L`a&{kzasMt3xYRBK:'-w ؇\D0G3)ѕG$̜'u.j`d~\fGSVÏ';qkS<B SYҸ7[w}"U[DSL U.(/ӄ[+44^m@[%Nn'޻fݻdPt@pW dС@iL()W2𤓈2tWS6'bG2 TƹJ2] PYEN# !+?hCE5̜ܷ(ފF=³wn,~Bg\p'vlEQK97S4% 4}'՜d% M`HBh0g2Le0UI"˕Ihi@JbN OL!i+R9nd Y`',\ACP|K_kboKI9aXorBOD1'%@aS }llll9PFD敥k=C(g3++ ÙfYwM;5LM~bw=$IXvCooJ3YʗH,6Ͱ"ec )@Z4*gfeeD<db+lImv[=AOI 3Gm* K:ȾrYJZ{L/oވSAeT^hFs⾻>@m_t/_6.GŤ ƤeCE]9Wke+^1b̑J(ebn2y Z>I mRxi,Dw۫W}JH>lQbr\Z;KTXh:xm'H.z!0dqJrt $#4֖*ic<&2ʢwx&糓ԃVky:Y1'LR`R ,W) s!$:fkKuJU&qšKЎC,#&i.}D+ |{!ʁc 6 Ew&ׁ^[&hmG PԵg7 pQpAvPEn7ARo\;K hEV$tq"!%Ń8YۉwwP=TD>;hNQ]Rqs.~5s?N99;uu~dbz#6{)|gl-5U>J͛Mr&vM ]Ƕ&d4Wkmyڵ۽ǥ_wYaۡ5Emm^k[fgzV~|LF?Pn63ƇЊlrvV3jWs! ˆ|ByՏV?mHouG *,(j7g\6lSbeB -?Jg5 Nթ]teN:+/*oKV36#py! d$U2 Bp{5#6#mje<ϓٟiy%[&_rO3: +c!VFUyX ڴtxʄweepTrd;1x/:R>j̐[4g&hOd5*q5ML0Q[cJK(62pP:NNAzm A ̝ ~= G\1od_[ScE6euwp啯<[:Y\~Lu͗K7B憊Q\6M?[:rSv^V$=h#GSíI9B<^r;=LБrv Dt҇ JmEG!2Id:5b ~!G[jg1ЎqZȭ4za%@c:FJfIKZ;ѱu6-嬖1[l9ݱPفgmD DT2rQ)!9ͅy{5`Qjr:%p>ϑ4,_">uۡŕ:Iҭm4<yQ[t6(qi9-$%gey<\088cF E<)tFJGj-iAORG#c<:yN+aϫcMFѨ5yc Nh9ᷟ}>_>㯿t#ӟ?}#Y? 0a8hS¯{nuߟ~ٺkB#]uM-Vnӵ ߢ_κ|~oaykv !~8e8%sbs.&yAZ|Mb~:?;bĭOSB,a݀@gRlM>hA~<$߻uRƆ,KRK0" ׌4@\ʨ\RS)h铔@-~s*oO,?M}e% g[LLL%m"Wl>?Ka}Nuٚx#]vw^=Mg ag;/CqFˠtJY4ca.rJ7l.%!!YFJKTUJ *zHYEk+t j')#<9>&l%OdKfҶ"sANMZ/mtp9 .ƵAkÇ@_۵Ev@LK$3͜ɼCe'D<s;>m>G tϏnGs Ko-ZQr .*I1B SY\t+쒗EfI{qA:g9!Z J Y18p-"gu=e_wϢp UPQDƣAfɛ(wQxJ0>ǜ;Kp쓤w.)J}SRdET斧hb88%rgn-HF8/ ޑ܀Pwb:"e4r},|SjLY&{`Y%NT[9ʝj_2ϡqf='֧ XK!6ѐ L9I Hz{N;9G~/ kOtxrcus0P89d; yAlS.VJϺZ%IrJJ*2f Z*i>#VWܥGdmFޅl9 5$!Fp:рcBP;e6ՓxpGV ivzXn|Z6)\P}b4wzr٭n:dt28ecQ5VYD\uPH` Kb) ʠ D Q .KE^)ͣF9bhm&#wEpI&$Ι+E1 N:kDN=)푤iP& ǂd|i0}5Ӷ={!'T/pԀ1Ϊߓ.(q"pUBPЕ)W9EM iW:|Az-ˮ?1_aUc#BZNZ Q̣.pˀ$+?{WF;[07X`oKcp-ɖ[Vd*n%؎bwUybUS} 67ju6/WoŔ_a6,֑c ˌmٕZ5׊y_'?gIbyvKfb4_vwĹ?o8^YW? 
~f]5RHl$V*C0Y+ VQa絪\cS۰{L6m(My}<zϮ{|۰;'V䋛2{cبÝ=p +QZ'Zl+S@ZGL}yfUӰْR!58ɮ'Gx)J`s2`T ;֞8{oUUAnBHTǶaVφ->.:᛿jЅ*-K.ghD"\E%>t u-DT+*,3Sͤ2ʦ[9UoC%{!Im*cA2ɖPˆ"g\Dn&fibk7ӎC668z\Pa*iDQ;Q*#n ~`m`Z+ql6y?Rf AأH̶C DgΆZ;G {Txcr%:]SmS|㞝;R ߗ;S(Af22yDiQECaʺ (sp*KDB(ǂ&ooC;GWQZv簺OTNr =N)4l̉$&/QxIRG+dP('&$1B$'נddvڑ¼U8kT=?{BJW%J;cM쐌(BjZ,pXȱ΋-сJ[z ;wHhbK HD[>3, 1{g_C+90CQ7E(T.Z^ha;ۗ;nGս}(~Y9&]݂_#>P lWNcuN$:gw<t >R5&oPٖWФ)<;. {Olf)Z>7Qh[@t"cdΨ2, UY)yyu9Dtbh˃*Oo=ok>G2Kk/P*9U\'N%vvٷYJoٷMo'䮪N]Uqɸ*š*Sݕ_-+?"\$qcō So brL dv)TzθԋN#]?cmSHS]ax?Y&NkO/?j$Ef١VcAg/1=-d;><+7Ov'(5Fm<%cea#M H$잡ҁ^.IEf T=퀑mZ:غudd  =vnZ9UaY9 &FF}%}QJ#*Rjcc¤\d2IEomZ[BjvJi ?=DCNpb/`as.Id[Zc*uV΁Ïik-t 8u'<jv9 Akm<8*'z'Q9I|qkJ+'e6@arGl8.d&c.5/Pذ6;ػón?T2|IKotQJ Q1E E0t![e^/I&d*4F#>DR D3H)y[ֈ Wk2d28:~顬fqPy(M=nNo*Q 6U {h),z s-q3W 3}-)rVg ]]d1HSU1es+-EѶ(!~]ߑYR<OB|>Un}݅1ujn~Md?fd:(\i(@ ,Ap@EJ`:بJ7i ~g}v\{y,͵9l 1KhgһbvoF i9oRDr;,pBh5 =H!D!qR B($ʚ EJX #-Q2kڧ 2 ME3fGJSK>1N3Fw t1Y#t ԉxDNF8{T]!AN< r.?\nxe޺8 9_<95a/#mcn{؀ ^ujWÜ魫uiE1r}ŶH)tLЋб7.&CTt4Y dAwH@ YWT}UcBOK0-p^*H\HI r-hǽo|/D{!~!i\<x}qoz=uz!Z'P\&4lOɺ\=hAE@ӥҶ9_PǺǚ դFVI͆r F,ƜFbdyڈoPբ"Zoula2}{o5zn V3X~U=AQrUϽx]_Jn]) RZ1(k:dEFqwku>(ԀXa)RHXB4I),m&:]HetAmϭ]nFE 5?}!R %v }F`:8OCT[f1@f=+"$`s,J6͊ǷgeT082@F–gNS8 KRˬARIS3PAB٥Lh±CYUc;k&Ξvwm$G_KCFWÀqp병]g .O0E*$eE{=3C8% %ÐfwMO׳wfBܶPS6,D@qDiJ #)D ^3e#= t_i##ϼS RpA)\$I8uRzps/Md8&iSڇ*yߛZZ~6OZ0:rwcɘP*@4"g&LPM=)j‰D*VH%{щ#Fg_/%ʬ%zg<ɶI<j/N25FZgCP)hqq \K!EHŸ@u]Di2㋞ uD|c܂=(z#{t˘ H>:IFKFĠ?$y$N!$8޳Gw0hp}͘ñv7lRVM(oG.mMJ 2OHM)D'ӽrjXQJH,5ӥܔjZF2BEi I@y-/"%F$ Mh}0ʥ2;1%`h. ,SWحU*(E \(86^8vCD:Ѕe]P`H][])4ݧ+{Öײң@s@ hji>#)t$HLNG!|wb|iEfڞ\@2xddJO^(4V$ C!JwA ˏՂG(eyÞYWuO!\ ֨u҉(P0myA2f$c#2mn&3 $9K=l1~e׋ c1v 矣," RZX  Cb`|+GN_VEbcjb?/QfOsaZfa~$俺jǛ[!x4F8!|iL.PW kSZ9B+Դ.9Va_;o/fͅwb$c1g߃QNNzn$F?_dA>:ywjGAHHbGyaX0LX>##8W2*-jGëDO f=oG?dۨmJuϓ9/(R#k:UN%ם FOq9ᷟ>?柟o2}/?xZ8Q1`$8#1yMD8|h6C.-ƥGn(¯rh~[+oGWj#o\k(+m; EF6?ozNYDFa@gJ>Ҫ5q/?O/O2.A%U'`;븵H l)sApR C~S"oO,2~;7s4 &!!&|&HE. 
{}"Cu:٫wݝo>MgsOy-X;_ƫr]zK xDRqP:l))a ?E7ݐ5ʒ4}PCzT"ϣlh({jA`er6 eER͕ʢZQD⯼\?~%0JaAyFgO8e_~NxCCi^P<dd4ɡ~ FBuzy{ݙ?l @]nt@/Z3غjC}7-¿_A8OqmZf'G1Nίя'!5&T UͫAn'N?_$^9'W XBwO1/6GnDE-j~*_QT:- LQ =Š*B{?usO9ȇQùhHġMUmn'y1-ཅU&dC#uT}M,_7w~^ F/j~8nwfѴqݧE."+6 r dAjv$9O?+3J3AKx*r&2"uh߇_vT'yDw`Ŵ=q!`vZ\*IpנχqrԾ0k.&O -A8m&x4L y (sy^1G}$11MY0w }9ݔө>ya?w=s~Ez*M '{$DTy3)%2P)RDd&4I%Q)yR;y-r/^`ĠF)@*T5hP8e;eБ!Q  ׎$AqN#wP?z'Ӳyws-y9tnsowlPwfoS[$ -Eh._o@P#01ƤXP\Ҥur H-TʼpJ'<ߛaڰ2"b )dCRh@K9:B{UR\üO3~{Ip>湞KdD I1d2B @r}{sOg!빾3KaYWMp]D(ڳ ĆyAdGRB<3ph3tw. 7%&"dRZ8HP3(|ϳyp\%{5:O V'QΪ=])b*/-?m@UXPsmօ\|sEZE mn Bl,r(7K.Ig#B\itVuqqJ#4O<~i$ʈxܖ#mԚ4ӟ Ki%4iϛ;zl gfz5ųn)W)/F5iʏL;1 #{ EcޱhzGmvp>|mfichKZ&iu^$ 1oc62Ӫh-f{n.Kn&8Gyhg_}.vs}L% <%' sKMBdZ2!0ޥHmT$E[K>N=_PGh$߀p&flCͤq((4r/ȘOpaD$:-Q C] n^^>{ x 0${iv˪[k}Xs2xRL0yG(eajRJxA$9h)Rt%>_( L'2f_=g,&hy5)` uK;5QiBqޱ1d22 cЩI2<9 *ÕqP(I!F PX`c`cS3r6U(V/^n KLydBH)ŚDP̳u!u+/~Md8pNȵ RƋD-j@(`MT vS*W -lP+.h Gbs Ou>VpirDB'D\ D3ɰlխ3A9|)_6/3QqO(іBr:m!)q5+QXGT Ldy)ylJN ̈́okkQbs0>O8РRIU<\PvU*KY ^D!Ri֥JTftRЀW/ke-t;H˨`SnCCh QZ (,jex>n9~68& ?ygLDI`Q>9ڎ@glT&ŖƼ[ڜ xJ0"՛#m}-IG(i^JEὢUOuW|vGgmd6Y^al}`{fޘƜsR 1s]vWL֒`$x,ݹ؝Sz:s[:v<%GNxGt9j CM%RFqmE]g9ȝA6L}q %mϜy9 Mӟ U\Š iJ1H-a,Eƹj "]pZzRA<=8Zø;pasҵˊ ,Nt,^4|>,!֌6}V` X*d)U[M9/tcni׏tUzgcsIKj\11E`C`aZ0Q&M-s3Nz ?]~M;T8珍+1]n]Jݬy`ޞ_ﯯrh|kG FohfrMSCZKGQ^tvYKzs|&OM9?\4?ڥd#ϳ2/$.F˜2b٨(G08ש[\wΛKf7FV$/4di O[o.6ˇ|X~$rz?Ţa ?ۣho/FȽ_գ%"\|9?],Gt򉎪ޑ{J&#ƣog_ίLG+o>jG1=_ {v14uCLmx%kD?g/*HWu)i5wCyy#4G濟XsvN AgT|:ELt1\|_GT041!AqCi*+8i=2''5&$5:=KȔXI[S0 0M)1 %REDJPylA  !*9E@dUN-*h?IfΆ\B攅Nz?:`LZ& R5CTKJ. ˯N!GB'Pcf[[h*H$^+#[{uטeǩqBK)oTuC׍;86Y=y.>y1nӵoNҵ(81T~7b*S(k4D*cTlJT6_lz@$0'8ò+₯T"J\=C֗IV%_+&Ł3 bR9Hu**a{yƶXh:cpXxeQl_3^#<8և<ާȫѶ? 
C:99zqq[J/0A(ڐvXI`Z:ևjeJR +b`*ch^*Z"mtM|vHױ* jU3!yH1';Em8`hcdgű1&MAum֒f,)̕N)X@ɐB l Vi!;!QB,EeI٧8I}DspƩ b7mgD3"Έ`cy Rщ#;38%+$mPb* 1)or߱<;{ܘ-y0ţ gPȎ"%bGu>uFnl0WMa\S%댋nxT@VLҹ6J<:)0p`'V5Um̌ũa7}g< Z|[\q|t7 ';A0QEՏ{~+??f4!QPϖ% lf1=U SMnBJGw&eG5\פY_ۏ{ZcZcJc+_zX$lCIaKD DKΙR&m3hM'+4I.+s ߏ$n8y)Lf) Gud!w5j=h-Lx6Lj'{,'{lՃ+1 \5s=j֢:\5+•W+qSxuu9 o^wg ]%[ A:;qHL1rmzoOgg?Um?~ Ao6 ;+OI$o52(I~;>g2@9_EVd2L}3ha*gx MjU-9,Gi>Cx!ħe,YDdA5zt?gN ]fNYJv/%6NkNFOVoxNN1|+"RI䚦Yn4 Mir 3` L.R2d`%LZ'0k%rV5*o&c*.h|$9%9F K)ըj H4GhMʭjta4ϫѤb,;C TH D!!!8pe/KF/J9&cĂ#v_Dxr֕Av67{^BdZ}Ѫ!6Y{dGk&"F}ǡ5m1x V]LYIR澽;;ܳ9w_\OhzL#8߹h;6K޹ ł%kՔQ:7Ei}tq/ܺ>c6Xg'!u6-L\37Cik-5+u^cN=tFݔgOncU1anAS-Y#9}QhG]I?,5祛kn~rr~`S791Sb*T041!AqCz>8tTƝ^'  :W4"dJt)JW v>ru!Sbv"JΣCעD ٪  2B0Tr"Ƞ*%zZ UJ$aΓvFa9=2׉PVRm[KKGZ]1ȿ-NZb$ t- Ux󱨧IAfK7?VEwlѳլa]T{ MZV B(ՀrH$.VLgTz#c7s6Uaa7 Mg,43 os]|i~{yUy|_0'/)@lLEy -KE+Cnia:_%S!V1CgfMgF9ۢ6vFmQ{fEV|,d U6k 4됝G(yMM5)unx%^\nP7W?(;3YU?>$oڡ1`&O NP5lQfQM}'7pOuK̃.k($8&dmENUHw4۱zt: C(eR*9\༱̕/=<D*( PBwOb7s SUy)2s!Q!RV}%(m,|*[c|y>2q=쭑 .|1d X%="}[eȬLп 6H >gC/ٲCUW"/D:ghsms{ژA_[a9Z9|:)k-  b@0f a|CB)aޝ;9;-cwPrwxbK H2Nh+9`WzS=S_\u6sSDt ~[M4ܸiQB s2Ț$.x:_ %E9/| ^>1EOη0k{Xt)B xϿ~]\PUph Dm]]eV1ꄔ*%I#q'o6yBw}NrO'HS8⹠Ȅ &9ʁ/&FCnFntOnnP__ Ap6 ӄ҂nNn3-̧7T|&^Q6Ć;X , }*t$Z&$;oC4_(.6?Go= OA=-ur8Ŭ(pMMB>HeMʗ=|(%q;&,ho~DU%$4l .J&\#e&JXx"D>0̸qBlGf19j['Y~|R"镍V_$q wMHιJ" y"_ .΄dB0!b=\I@%$pF/LhAFbIRhQ;n 0ɂ (@7"Cc"84դV|ȒZwu*|IIOw 0\y+!BN㟌 csicl;]0UAUg«S` Z7'j?ˠUGǓ@`_*7Woy©GAe8:9o(VI<]8o8nWp=KbcEə$ 9w8ܘP; <%25ʢA]k$VAhx$/&- R74w>*LTo+oj \T;d4mF2\ OTXEF! ջ]GzAiy)ԥg[;&()9ɧ]Tl7>iaiD5v˶ة ϋAKO-@YZq㬁K-H v(6PC%G\Utn?<]-[2}?zYmηR}ݟ8%UGyl1@ˈlqmj̣C궜,E$9$$PD@ABG$˩бwn!^}Q Y:V0%V9^ï""1ţ} `:>a4Bru\5+3nn;Gq3쟖+Pnt>}>LpUy%NJ )V4ID 2T sr{+gP ^(JD5hn2D0HvT')tHđ6r;'O{46S+uatnDVoA;鸉1&U'WsiKnʬwxƓهЃVky@H9%4T4!ِRx>^t. 
6s ڤ8Ys|RϤAD!H$I1L&8m≈+=ZcڃpYrw&Kz/-º-Zdv_p`hSOE][vC+LvPYAIU WS΃)- V@F$ EP,8v,E՛C~w 7|KO{{[T^ƨoQKI*8nkB垾z21>CQz6M=ruj3IJޏdmt:9/hSBq!?m{|+Y/ۦﵶB[MƓ/ }7֨i8s(:mF['x=]!>ï]W{՗qXZBjBLiyeSUQxU-7Bbqcjs^yR49r_)]dG/t?--(1|4MOh ՀIvu@_s'mzYaᆣ}գk;_LA DŽAҴZ&zZȆFa_߰$ӭeLt8qgHV4ࢗZb"MqR6\ .c:qO/,~H:髷 Geb+L4DJ y \((4J1IݡS(^[toľ'yϔ7-e1AL.?5rrrEo9lMdBB%$d @ )Qx\&T@E~QN4 \&'ud,Ce`Dn(>RBFW:4wB;cg7:,orDW'͹A)u)O>:IFKKbЌx*E ѝx,Xw_.E::[!&ߕt>ucB8(,/>eq\Rv$&l/RUcuΝ#NwF?UzaV*$y}3_ b%~O0'gۜ8v8㸭yOmS78.KQ.J$aYEG-ZYS>z2bꦊfAF ?Gx.RMZ](i[x$/&U-`[G [>͠N3/M~;?57h^TsupW^ѴR7x2bh{p-qvjqwɠF?^Wo>6rQ\ThEԠ Θ)մik ̧>^3N/'w/ Hx!Mh,eRȵ cJwD=F9uʈ!5LhJ qǙsgՎ1F!vA!wmDwΞ%Nໍny-:<ɯ+45[ /0t$P“HLNG.>Esm"<|S@ponh`bBeɫGo3zxĉVa4}8hrx!yHJ&P)Ƈ7l/aZA+NMmͷ&*cF'nUr#eeѷ5"yI E5HeS׶>6Mnǯ{-6.ń%"< bd#I3/†[cB,`bi AhE=7\͌RCxL8 ut};rt) UU;M!w[ZbBUt#N9\ΰbU!OOQp_nrrS1VzAOf5_VSZJ#or0IU'$fv] 6>j~RTXGm9)yEߊ41B4#`3@c {< ^[~HjX"/ig?$ѕcRa46}ZZn2oꕼӷziwzv~yښz@t55H6o ojFgˋ˳P9C Sð]0&2O^a[qMH 67{fzO~?xZo.=˝c-Ol$>o >i,/"4Ռ7Ptr}ыg/eNT>m&o{mym׃S=mH1#g@Iȯj@Wg0l^&6D5/ǃ0 XMGvlw*GtMˋ;oa>ʁhGl J=9p _xwoMn<ߍ)|rYJSj]hL!—&K uwa .F/춛;oZl& FMݵٍo 倩#&//~7ϖ[#;( ʡ+QwϷZ7`'Po Buo zkil8#\`5tsH]}7$q3Nd+lE>$W\pEj:HecU= 㪝`wl歹\alVJs\Sidp-p=ZHf H]t ;W(X0 H.WVʮTjq% W4%m6"R+R;$ "pEW$d+R li:\ifi {fZ-+Ri)2Ad>Awe]I l8or2ަe>:eVgf -h{r4H'ƓlBQ-"xvp,z>d㪝`{}]>Vjv*x \Ww-z WȪW$ש\pjb]Ԧ }eF"Vg+"vWR)Jp݅VcWV@qE*quqP H`v3H*uSĕv `]\!sڛkWΧJ exN ~wP"6ATkARSĕPugwlZZ}O`j-~Z~JяNd)\xΝ'')CͥÛuy+34 G?6\ 24 bIpB3ErWȤDp%,z1ds|\L~qqN.U;BFO-p%z\ݵA1LF"e+kM.B xqE*:A\q5@F"Fe+k].BٮTrq%Bg+l+Rˡ" "B+,W$g$uWRW'+.#\I.k5\;:E\i-ilpEru63v~T3x2펟i._x.<&35B-bϥAuX4(Zrc/䌎hUBds OG21>RF@2̭+hZ[r$:)k!qrRqSY휵"^3B@_ ΏJM"eΊp,0wUfZԪM~Z•ܳY ']<G S௝J1\=Z`@B+ZlpErK+XcWT q7\g+l\6"N+TktWk{ z\ V xNP*u\J{\ ̓Z+l,d+kY6" :\iDFB\lpEru6cWt~T~qepf+$ ATi @7ZΜ'%\iR6&]|{THlT&YTjdtVtZfqCsIJbRpcavyՕq0 Z8AՂ'gx𖢾ܲ/ϱ\O^֟^v]{MG>FojXLh+O֡9Y<``BqX|6mPnmס㘦j{ҾtTIϱ /V!a4[-[PBɢڻ4WWϣ{ܨyi^̌_œm_1uNAH*c%_lbn,9TI /Ǵ;nzU,%X(RYZ*E(t'NZN"%%*P,(m|4*U5~@6bؽ* C_;YFCюNeN\']DD!E{m1JmHݢRD#Pk^l!Xi@zcxgRb":Ȅ/ }ԖhYQiW܍J7!Xś=Ȟ){9JpKF ǨVqD!IBc`Id+S!'#ULu AkZFS7L[fbZ%5@! 
,t h2ˆ##FHJb JɁA}PqO8-׮̺ƬlmH "I1DDhdgo0ϲK~?jڅP`H (W^sBr*Ԍ 45܋ zkFzsªRڼ)z 9HJJZFq61,{n ĝ~b@_uu?siq{QF_ GqbZ|\I&e4$97Ҩ0܅_lſjX>}.uƚoHs<〬Jx>(~Y)Vt1P2,.8:gG'ZIe8rY:oG9 4Ul%9:zsKS5Iɵ ƚd>5i>]E@q:8ҦM"C?osW fhe__-x~v~QK4y<͎Nhe7&ؑdLmit)4tƜփE^n_O~6_g[QrQ9&}hxrX[ivAB[mEnZQk ֦}}$# #ڇu1Qe&Gߖ =sx669>`dۨmsIQ:Kb?NuRԋɯ<(c~e7X&yK? Njǃ0'.̧Ӈޑ<R Gm$X|4 >n yhB#] M-vZw9fw.2 L^ͷ6ha[k,Loԣ\iMtq= Ml~J.j~NQYE! 0 vYL&߈0FD^$xvS^9ZzsȒ$T 0^v8,E8YEv++o,<rL tFSoѡ&~_ܥ0HMYɹAv'^ICNw^|εI-yJ-Cw㭟V_;IA&+ÔApLh+Yiz56;Dc|ȎۃvN&Mum[Dž A*Q.2hw!iGj3K:`Ϳ51&y$sۨ$("\fkR Rd1Lꌜa_Eَ$]2)'D*J8/RmfF!Sk*>/*$rۭZ٬;IΆ_үˏr^w땇`y(1V<71`$F&8ɞ+ꝳV 65<7P-w`כA7c=f-oɾ$ J\~r%]l&rsPGR>eL,5;r쒹ѩ|wܻ_sWLþ+ lp9.Ƶ$jnV~̰k)O$%A8cA:2DP_=r/k/[{_Me?EOӹX\9gKj? r J&- NdSIkIˬe/"^>˽`x;% O PiL\#x !4*$5CxLLMVhz'~$h92#$NwNmﭖni 4'/ 0'XnRZD-d2*C$,Ǯd:?=h=ymG+r.y JQhDdpŬ EǥĽK $e/y:<0*d=A3KRYBJ0b9ƫlc%2H;=9#R*%C=w`fMh³.CCwu/PԵe7wH ߃ȎRB0ph{~$|NW.[1N/-YWB)'322/bDQJ&4;r ˜~?|?L ݸHIfOR& CQSm`jIr\oC0qkf1J4fz{٭6lh&avo{I8mG5NJw[bWo[nn]|2H#7zq8Mum[ߚc:aQnBÇ8mXu?KQG E\GZ"b5P $!6m*~V7Gh5y(y0XoX[% &F*o j][ΒnM:r3g9rua="hԱ t݄o$eϑk1fft-mZ i*ӿeafNLBcBQcwj6ftΧ/ ml{&)2A5ޫ;:6q7mz,d&fVwQؑv@K{[ƚI]‡H#YOsp|⹺kTݞc s9{eOA0JhLx|N%Z92Y1D::qY,6G؜:8d簓T|B_Pȉ{QBL,yjΓml\^7BOJx`5qQGlƒU"nuӬ55CRyGOL)qUq{T$"dZ9y9X+˜0YYUSAzDItKu;>xK9`ɰ%2v U6ciWR[Y)ÐZN (ѠKQjg U&xxŤń > ub>EfٹlE.#[SR`:jo]aPۤe`I!q@HNNx%&ZL;==%MӢJRClA[B۝A;܁{? 
$٨K3_|HR¨ OV$=$S[J{CRv֗9&{yۢVGoipU T(ݧhR~\`Nz$SHv4l~s^Wx7O.-im.}qA8fM==X|PҮkQqn+gjC/,F7ԭTܹU?5|z.UJPW&i#>3wpD˛;2&De"L30`!ȨiPDdX/͢UK],Jґx2+gYQZ,Ym~(RZ`驀Y肈*>u Ɉ V;g*/c62)Mdܠ31hL'c錜2ȡ¼uf 36%81A)ylΞrHT2̲f9<  gI%34*Dh rA\=g]TCⷪcɵZ3xM ku{MvJ"®l:-1L9/v!XS[K]_[L`mmX̅SZ&aU*;&&0DY*#-V| S.?Lx Laƒhy R܂"3ɥh,eKX2d,J)c%b0WKe"S_ç=_2i+~3Je0,z z iehYt`~aN`*qf+VY$Nw:$֤ 1DL3$t˸B]F GɷW.~5]'m|g_4ϣZ,ϫ,zzPH4pIW_]xץ'ua3"*bJqcw mnշ^z(}FDmASr))anS9#&.2jYp錘2$b-dNWKG{agT2svHS :Asٯ*W;&4[>…, ېd6@K^~ *'#$sȕk l6_Qp꼵/m hLOo`2RY7mP<'aξ?X9ƖHuO\Wt(}xy5 7'Z̖YRjkiy0XeiȤ:JkOi=}apq\&h݊2:ǘg%@3pAГ F꘭NǹD F٢5Qc\в4cW,:,<(JTmr]ƣӓ y/0+mO/e<JMq*EYJlP)X$W+M$'"6HP=tPF-lԙ|;flhZC,ܯ 7b7&fA\p1[ݔv0jc]ۢI'hRuT @$&H#k$8InF "CL&4PR F٢EG ̸ڴ٘슋a\T.vNcji ٔWd! z&$.1[tVApy+H T1ynq0>HuySz}%ox(h$eZo6G1{<ĕo g̬y$_O1ukZ~]&kyxէEqWz<:|*qtiKiYD^_\LF' Ŋ+ @6`A? oKclJd}o6-`;Ob PNDb>fuAFvך'tJW'\ɴeLّX.ПI~t5%ϥtaP:jrV oՂ?Q%fCӧ̺vFTzi=\쪸0YUIn"?HOն%1R%W(B֖C7'ZC2IrŭY?t!s' } ӡ+ru5z4Ao1plλʹuaM.޳ȋh8\3woUf:.]Z}5iy3[swm@hagYtٙloI\/9ޔ|\\\vWbG8u^7ˣkuў;>#d``xq|YH$\Y (@s{qВѢъCH *;,&+6Dً\T'eLzl^oK͹\Wr4NFo{}V[v -:Xżs:LPͦ , enLL1G%[Rk/}i4x)$5Zp lȞ@* u%@gI&Z !GFkbclGy7`e1;wmtzܧ|+Chܳ?L~Q#hT.9]1CY\HDN+s!Bb]Hh -r! VN0@})&tKrG1cYG 蚍= #qbw`'GΚ >C=jʡSn"Eu?nME).r9`jN g9 dz1ut礱ŸMXXOc ]Ĝt|`qlxJ֦W𲺺thQ+ F>6@dbtUUhyx ;"{DZGdwC2 ”&Bh揊+ҁ<%1jAwBw! 
د߇,A\2}3X=c&n;:ƭKw$"V,ّn:h a+p kQ0kˉ}h Nx="!wDlķz\X7`:8e걹|GuDc7ռMUE]Yju;]_wU # :.?CoSAEC6t\5AEJkWt` pu=_~콱D?vi n@@^[)fr 7cߝh} 嶓Iv\iRNt;K>48ʎi^CTH&˼2*/y~I[XXcƖCyA1Əȣ!3}6qAGSUhvkhPSЅ]}oy#tU!:8?}To6MVcL-b|7NFKz8ȊC6=nx{ˍR 9ݧwoVp<".zoA&qM;Ik-0]\Ly`EP`H`-x4HEHi}oԖ^eH8\=I`# S5ٗ'I+}zZVKH=TW>z.9h8""Qأ"KItHiWWE3+^߻rqQK:Y |R2q.KltU܇y\67?}7rbk5̹4,y> 2?lc/d$wW=ЂOX!E\#ZWQ#k∦lX)VSm7B$a6c2h?d;qg+-* 7(?v7ZQa1vUΞŐ] iuTR"GSc+̡)ig,^eF)DJ4f;O' Z$DDNiH?9c`hs /V6.CEkvF^h\bwWI+t3*6=oWUTہɵ&q1[ [rXID^g2s9$>vۘkD|{ieC h,[y%D&HK*m >!!TiďUz&W3Q$- 3I|,4lgv0+5JÖ1LxJ %S"ڒUm)u2蘉y֙@&p&{ B `Z\ >CHq@GC8^l$Y7p S">lk_P9hk Cf5=]U_UWW yԈ[uTI"[dtZA-7wvRg'͘zs@%ʊA* 35JE, N!X&G+c&ӛ=FANnL70u[uu44H;뀄.=9l_ZNrG-x9J٩"*gb`mW)43OTHrMN\I6R 1HQbqg0Y{qP)|.E邼w+k>G;ao=3N V` $ SDNnɥbNxmd<!x}:տaZ-6` :a9Ta=9&.\i<`+gmZ%N2SnL!HY}UUWHyRK%4l;q[!<A'!PF_SWBb<[zuV6O5pSRy_x=\U,5䌘k7U9TvAKm4 +3Mn? ^1Fb :Gn4CT0#0a!`,&M>o]&6JQlYPF \40n8*mO b<gòDײ:bCzvz7W_>^}o?^`.Af`(&4ߛ_#@A͏[ hazh6C6-ŨGnHD[*ւ(=~~}=?.໭]2ךeiAg l~='. ޭRQ!vMۘEZlaÍ62xDj'n'T62r+ #Zc138l!R[n;;;'}EyF,cQ< xH:8N˳CS봲sT$uݝo>LgSXՅHԔ1тFQ4G!eqkl5`~-1AeT߳ͤny+S/EVg{mz_t/_^::K1FS,"HP4$@`-g%$A$BZǗ G00A;`\HX"RLT Z XEUHkő6`G$; 8"ǀv;Zl5Pwjq ~IL\T3e:% G <(eyU,Qp+QOxZ8- _T$\L~?CYe,'UHJ') :pTJ Ew /7Mydt{ZLOFWMJeTX 7x-1 `Vwϲ̯2Vʂ] n/TjmyB90\S\׉Mx¾´xPSb`'3P$}9t"˙Z7s"D@(2 ʤ0^= ΤUI?gH|=[x ِg-ɓQ1A_n( Xɧ )=~u`*ᢄ@QK\h}l06{z܌z+aZ5<0InKeb TyC4'ΠߎyD.oon7 Tb.v/xjm;sq͊g~&؜ DiVhC(QH-2gc&H.pd\s:>O|B3<ǝOQa&>9%`^.B.:Df`ZJn,\ v^Z.^Gcr U [*#,yaiȈs͕>M92­=l&qI|uE6NPa dx!18Q B dE\#'GDݻ'c(:vo5?^<EVrlQ`PIlV\rA ƙ02ww.:k- 1<#aΩFpA AhH ؀nBP~!f0Gjd>9HysQ lre@kl mE }'UMSpGu1ӥVU7$us}숓rl-0iT:zt0 Òxep.XLX>I˔)w(y;'MH$] e!׶X)UKa<ŃBQYN$@*d\[Y/WaM]R騬EI-&sU/Hӫ//@D$+`"ET0oN.[˝9f juU{Ԃ]SUiRQ~>>-Ɨ:e|W7B8<171%s}: C=q֩"R %z"YsPZLvacɸ)Ml| x$ )[X1cN\I#O81, `gaa'd_ŖdˎuLE-V[U"v B7^k&Ξ[!Or=aaܕ#lXGOF dR!PK_Kӯ]G{@.#7NLY+w 22#"WA2m<8 5wٔhi# =|KRg3=9|s^7D2,ǬH']rʲ̣YRp:[?VnMQyw8#vhH>хj3W>],sWLQ⋐rcb_v1/B7G\ ,;KN/LI;Y\r~hQI-"ѨL@T3یd:9J2֑Ts-E*oSڨI)yk77&r68@D #>.D)R- ݚym1Kcyu1Q?Y݁-E${$=SVDŽAxOlbMkB=#uek'㵺Nj q6 @p3tݒ, 'G }(Wdl y׀9q9 
2)Yvc$Myqџչdys8#CME[8 t![ǥ&oISؔ-(<9A=ͬPPts.ۢ_1?rg-OAbt dQftYZn7@2RC40s$! gyiқCto.cngom`wa݃u"Yh MHf㛼JDY֊-ӏUDTO*k7  *8V "ldbootI>k4+CI&D L6] ($dqJ:g,O><yƽ~'{ms̒R\|[1qvx_,zd3@NC z sgr 4]™20"4s>&s(@^Hg=kx_C;K6Aeu[lu"%+5gZоͧ_/w=I50)2,:1C"TtxՑaϛ55^4SK>JQ0mFOx@CT.E@2 vq * @Ef !U.$Rj^;v4ϯJ=X,pO,?8YyvxrWy 7T@.,w| j$ܾ1|,:By[ ?;[&װɍef%5bdQL?~ho <+A{{FhC- VR@dPPكر*^Yake!.x'jq:Q#x1>A!sǹ<5X #B+@ɇK/nOsΥi# FOZ, BLI75 2 Mm \.& B#"C"d#L9]<7g߲Oګy^1WXzSv8k&r?bt_*w}E^~ໞw ;U`gϦ$+]ϱ]*-WRj>T|'G »n?c`xތT]d4J| (3I;}*!_gYd8*#Jኔ Efb\\02R)>J/:D1]TQ F.Kk糙8χ(ԧ=s7g3|wۢlFV[ \םqMODZK"\}#o@ul,YrY E^K9R@>"~钸w!hDQK GB@BhvD;6 ?oٷ[ `GRc9V7e߾-XsdEE2:s:c.DyE`u^(K{zGM0&Frڔ5=qO_t %[}Za \) Ȑ) | IT&EgFp`pZ y*CJbɕJ0O+I$gE?^C'q9Sp_:\xԱBBxrMNen3lz9U7W]ur*yU?4y?) z!sbdca6s`*' T6A??ߓYR<~důDfxX>eg;Ob(Hf3"srbj}kG~'9׭|r$#7REtGI+1WRtBm8d(wۘtf!cNו,J+O#*i~kL_eRI̲SD8Ho&ӯ&^#5]ϑ90xC۽{\Vo6$y}NJ+[lT ݼo!kKG<[EմQ ltK6]f4:-Fo<=lEoEiv\)]jxgOwڪ[a:oǻ{G ksSw\x.uzK[\jmݾ#Wp2j[Qػ :̰v6)9?Y˃ϗ'FJ\E$S C+9aSa%'=H#D/VlfR ^O E ^ xTR\F_x¨xF痚,:^^ m/:аcO>OU=3FUxkSꃂ 29^$SQjnfPC)SmF†T&%zIo+!C(6 [cPQKOFU"-8-Q츺eScgk_x`;-~Yȳ$>%t-&k\:N=CB+L4D]:M}wbG)b߅BzKjٝr1MheUf9]M>;~RAr[x/f_GdB-vT!/X0 3U+REc &YY[YfG]:xW"wJ/׋xuAs98ujںo8?#PFs6*q} U5v3QH`p6pUų*=HJ)WFӪ l8 u.pUUgW$bWfWorI/"?M\{bv$i%]=MJ3hWfC_=X9#k8 wR@זok]2b⧳tM˾i3~_#f o*P̮ Nd y%R Na_ `Lr&̖qMbxv7E6:2 Z2gg+Y6'W3CSxROAl΃i"S0ڐvsP6)+eyEZJ(x2޵6+bkNR 86A"'y9łU$gȎd{v =ŖdK,Sv{,Uͮ*~.YA4@4!0|1o*nk kā;o|PIˏWyq%VCU[sp1GmW{d#6Ƀq痍v焩uvNRuD f`;:vĀ 5mٝs\A;bN^@ ѽB[f[AK 6V)j>cPK5(Qp"kEB t ɠ3~l29me|b]A`61Y %jCDtlϏ7W̷vنGB+)+U!$YIVizd1H[ ?YwE^`^~g ݻk[dJo첄Сe2`DX( w]gu6ּ6PglU_#KnC~u^=ջm.cL𙴡L\l 5)M" 05:tI[c3wNn1{x_sވfEgOoXS'v)o$l{_O-$ʾV@|W||Vv\֏|`y1<,n_־s}ߣ-b&u8J۹icyуW 3ٕlP/ws0w!)5raS!D^*xHwzu%ƨ`A,dEcva ).e-j/.C]+ *ڰ؎H념RV7xiNj/lϺZ\"E%2.aRȄe}ݦMu.I`U(NI[s X|:$#&ڞ PN}}w-.{Hk>^O㙧DzyOTd#Ef J IrXI򫥭gG jl%P6k~od}rT,Td#qJs m-x(̓ 8 5 ݔai>lw>fzrd**iutuk|>cWޟXC5Z&CZabrɄkkɺ4ֻRA5z]O&O޸TTKr(XIav4X_B=“;]TdfHyڳ?fhy't+*KI@>&;A_QI +|ZvL RA5< ŤFTZU# h!9+EAl\= dL/'ib&v뵡׆kvgh=@ -Z,j@:]k8sVT3Rm\c]$F3dFפ,Ȫ2y07 :U!l'b&X$PM%hs@ݟo5_u 
^g3-9/~ю~q4^DA䕿X__FG_ = b˹bREi=Sšfq?t;ݥZnk:Ƚ1` V=u0> 1uPV*qw'=~ {<%O;4&9(È/}A7>i2{ߐn[%@hۡfyj(Q] ;Q]%%A$[@֜FS #&b@ O%LCLP}s6A졻5=$(3RQl;`us-3<׳5w|K1BN+N\O&(f0axIfȥ6^~u\k0HYNPY I㈘֕bwᔁ,Pd NMhE:HDŽֶ *X< &˒I*ЌnHS(h :K5CI*Rqs<0U~{Ȇ|gcMkg]ak.g}lJƝNPX77W1AT:Nbg!;MF1-pMA?^ޮ$)6_6~ _Ӳg{Xl`.U4B~!߯ nsm2?zq%Y+ z2gBLb/tT@(].o' egdèǜ+Y48^b%k=۪=&Da(Y6ύyH*%@e8t{!w6dӽuWҚCQ?x69Q~t=V8Y,dWaM-7d, iC2f g j,/9J2dud*$Z3k&~ ވ;la^}qs)Tr; 1ˠi!ջbvg?IuK.:!uP,/L!EgzBB)QHF٨D$j,V(WBeI 0k$1"c #ĶDaQdt@!5Tڤ0Am>,yzV>CGx+׳\unSyc ck-c[XƮ%KUЌSxu:׳ ZrȷUE4X_EO_o7?w(?&'K?g7$qAgߟ/ON,Ӥ4: ͒_9C^ؖ8wx컷s|ݟ ?PuKf9Nmũq 4>oI?-,}Ct^z߰g??C_l>őXިѳyqHt2[FM.~g󭅛iW4 +!ߟ6΃^M/o/O ;H{Pzy]]Iۺ" bSvQk8UU4zp;U!C3ޭ|mBT͟A1ISbA1'wPȍnGt{|ZqKb;4}v,F:!C̚ 3je-9nzSޕ ΧҶd-5$$ɤDU>Q0,mCi 'ҪDZ'7gߡY-w@u_1WX l㕓쫭zL[jwu[ƭjHܕBp!u󝖧rR:dȨ|bu9)flbxOJFT Ԗ݊H$a: iKAD+%m&@`$Zɚ$E2 Q=mO|¡#Ƃ-h}:3΀Sdg!Z%l&~Y}xs\䭥L/a|+#|Ҷ,Ďܼz&T5_/7TŻj$L%DAy`lZYzꤏ$̪xif=$XhgoҺt` loKDTasKZ<,R6kJ+P8 (ٹ %RxDvZZĠY3qޮ>f+]ʇÖ1ݩFd)2FXږe6F; ѐW.* QZPGgo񢜠)p̰=7%4&2?.m J)$&Ol"`&+x9ET(չ ڔ,&ջ}Jg4 @ sr\HA ATza+5TjDBZr`^Rd#OUQĎVyQ>Etc 9eX<"rpNSw`"A1&x~:5e Q[lBP@Rj+X;)NO48Ώc43(\*0%#W VMDGGl_M%0n ZMFE.!$k DH,Ts<ήͩnU͓{ n_7ogӳel0*9|ppzV-Ջ]r~%F>:}wfFPHli7Gb|HmÐajfY>!0<*GëfŘQ {fm |"' 9b<gu>Vw* ߫ugO'}?O'w}w[: ͷ[pk~C60^34Ule|q1r-"bA;psZOoGAu 2aj)cXy=2©hAbJ2HZc=vB-ֳ)P-w`<^ [확ZlUd]'oE>V r3wjj94LZY`/gvT'yDw?q{P\*,J]SQarkjj̴-DYSq>5޴UQ4h"B!1F:k$B$ F1R~) S>E)IU } n |6> ~XՋR?b4^>,ut1m9`#c*\Y E,=DhH$ Z4Jspg /10A;`\HXFj^ Qeޑ։#mdGoZq6E.ɻ.vBw{+y[ H TT^sL$^RRȽP9L{~Wg<^he==8QĘR\(ƨFJω#0(A s lM2"9OW OCzq>1xTOB$("j_cql{JSհfK H6-Yr=:,t&꧓a%57U-o!Q7$П٥)@Ih8hƷxc0-"&ѽ9gvUNcqA׿yik&M!R3RӴ9Mi#;qEN8w(~41ُ_q\ӧ C.V7[7+UoUg@cŎaJӿca Pv BmZ.; MͥF=}^o6p1|j{nch4W[nu?zCôΠ_x**4QEE2ꎁSӭeɐO@t=3zfTa<+3ݩ"H4h& | J ;1Yl ɥkpd\t(/,"H|{Aup~L!RDX OqI="DCi V1r =% t4߫IƆHǵI; Wm֛7 t%zW 47&"m 'b^wx򼣲mgVNFl4F(l9חmno)۶,/lfaf t)=+sX)Β H!EԊD9L}CQ{PCJb #52(B;XNxMwvjH:^ӢPjm$ \" @TV ǁ\(0ITcKVAR S:R+4aTpmq@S!T\q? fRrőpaK-P!;P!:$-Ic(# ;^-BP:DW7*^j)>",Ǔ? 
Ͽc)fG\_>dJJ.0(!)tZJtV k=o]z^UWk<(s$?67»UZ7ko[y,`^x7RtxGߔYQt0jh6y]7dTT1fU,K7[{_8,%.q,N.](I˔ ))l(y*;O0{%i % ܑ5|Uzp7{ޙ,O1]'pkm졩 girUp8 ]Q6vI<{>qYvװ%tڶm}7>Ɨ@uɼKwUGv;j^m=ԒN)J{iT*0=xWO*V%gĮB?vU蹰JgWBgW/]a,BΦa0#o_ڴ3dZE9,'2UҰQ}ͼ4/b< ͜Ju(p!W2:?&4]IWLI y6T+5ѡ;zon!orqw78Ŝ9s˭M:F j+%^4L^z-_Ws4V0#)6gF\Dcr 4eT/E,y(ua1QnT:Մnt:2ď8=B4O{:US;7DXgHRO.zy+$yN Gte%p|.@ fҾ[r J,zKZrvٰ|6*AuϮ^"T*î\ T'(YE Ms^2,j: YYqٻݸ+Luu0gy ^Z&!=@=kw7-GcnUU^ZUZ9r-Cҙ\_> .^sR4]>[]^;\$M}9rv#t\O~Ǖ5?{A O- `J☋_,»߲B]n%mkN"oFs;¸#>bdXQda~DL oۥ]|'^Nǩd,IZϷh7 hCwD*S dKڠ-h:͉IhS\߾Ӎ~1j+0HIA5%E=D]ܕ)aN",=~X 8 91ZsUWUiԒϯ=g"Z#jȇYoM- m*F)]xsaqg@0#,g{P` tB)1Qx[aգ-19mnmVܢܴR'k֬UVm(̤L BVcږtzG&&÷P3?xFp-4p^ kB[: VAP4kMyQ D #Wfυiq#0Q%#p45PzB%aPJ↑ttk oh J`H~֘-jUeҘ"&b9R Y 5f") %;)Z͘-, J+e?1 ՠL-߭~-n9?:u *!V *5Vp#$5Eqg5BnNPU.|GjLnxCfm6oo_t??ךhIN 1h}N 4Z@/ Dt'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N q: pH+h)tdԱ^ҕ!kEW 8Ía)te6ۏ;]1c{ tЕ%|Z]usK+FKQzHW!Z]5QW צQw銼(h pc4K+VWW@i( ]@އ#p< 𳯎}ܰb]y>]@B;?/NP}E|VW.WOkW|v~W_Z2ocS 4[/ P+kmY*6V|y“F/2Sr;ȩҵ0ݬUmUEe'>I\E~VۡZE*΍(~j?`EDv4jW]rUUͼe3%5Q=w蝉0vnkݮ޻ᛰo~ꛛW= yi݂>=ق~OP=o!F[)yDs$|{ܗ۫ ߭M}dSoɏ v} ;[5~3irÚ|?:n ~'.Oޟ^9^oaN{2~<8ʈ9㎾!~ܳm箘;o~ö=F)jϼmLw[RscN ǁ)MZq0gQܾc~~{KmZaٶ6l;NקD9qM?Hkm9 7}}rϱxtҸ'~>|?kg~;6?߯3۲Zr#wxmHky y^_~3fѶU޵572I KRuKRJ=S^:}$ٳ3U$[b˔tm׫)t7~v_jR۫/?׮ka~~7m`x]wSo./w?ެn>Lg돿Nf?ƺc"Z],~~pdzqY"}ZZ5G5UYvhݫ;ξp? p&/< z'xyCe{2Xlx>»w6{|_W+mC%q`Iwū Hi*KLR4jb1λ *h"JD {l*(-lg"^s"pAQbYDJ޲) BU`k,I8p;^nhS6>{`S@ǶsG0#H}1j]L9PN l@V --&xU`bSC(P|̺8ſk1,TFb#Rit"kCam9lMiGLߘ^{}d8ATqr<||Ov6)߮&\Iu^j>ÁvzNv`q5[ɶ\?x J=ѐSQI-tH6i)La qѠ*M=qTDͩ0Υ@Em*d 9R0؜ 4Ngl&a44X_B5“;o*2>.bܥto8~]z1߹fViYBpيEJ|Uj<2ՊJ(T3)VNP^HR("duh &+}pH)1k챛sceZCʹX6^{D4^dJ,$)`ViCʐgQ2oOKMaƆHN2Cfdլʖ}M"#( P*:Ȥ:f=$DZQ7zGܧ69Q(ް"*c4zL(CQ60 >a*i<g |MӓEYDa6ٰ!f#3ir 8=̯/κq )LK_4_>ADo x/F X+RD]vY{! _m1TjCʹXhC{?< ;ں}ٷ<ȃ7 ~}Ƌٶy{25EkŁXjŎEE$1>kטT؜`Lq*|3mc(^Jq,3Z+yd&cix|4u>+iJwS cǷӞӶ+6I`x9cU1\ngmJg`δD|Ѻ. CGe^rSDoCpD JγTNc4SsQGGTpWK-2$cDZs8@FcG@! 
[binary data: gzip-compressed payload of kubelet.log.gz — not recoverable as text]
6mB}كAd%$%"'kP|HilYo-jf!T3(hb́d/@h-e^qiM \MwժʄoXC{7̔ېg(Œ*7Ooa|nEe<ϣ/;fDgћlZ gEjd_.m:DO3d}}yMV ƽWy;0JX/8R=95^{8k7ϟ@J{`^C|_(u}/@`;g et[xh MhP^v<t=p->~cjunД˜4O" V8+ =u 5t05r:K~x=zKhZǦ|&!*t?;Tp'JA(f찳jU5MIճM/&jxVȱ&ԟ=d 9g FECO=:SSPljJ#wt<}ɄIˉBC+'p&u|ګµWk ^-rb72Q9S9be2%2D^":̥Znw^nXYӉdP9%"raFI;մCTT3K%c*J^RY4EOH`C-ɭVк^R(˨up‡F^PT%6mHƵ֣YEzӸ9jӧŮaxc̐ :l36>2` I`{s9L jLa@+ֺD!̤&̀S;sy94>SLco~g]sƻ2I, -#6:j\ Ә,SrglJ[xSsכʨ^RW'H9cgß׿gVQoKA9j?2u0Մe7cʷ&$N=Z#QI'? iu|^s]T?ZH?eN[pb~Φ\%[X9 +߰OnV}[R)T@;%RT+9o1˲:s88 lO:7-)%ic[-dR5Ԡ-2zKTUeKf_daG~}|jB 60Ԗ7KD*1nOoQ~/ /ǁuӒNjr$-w~D&PĞ_Dz"!eP\LJ)Xu3mx*=UuVؗi}(k}qP AL!JGkS$ic#'< W<^ujF}P?J8P Owͼ-w@}z|Vo}mqzz ^-iG,=f^xa^- ֜.dsupEˋ3zsbV(@F_@(Yeb6%r(r FJLFN/x50 )P/} xճۤq#Orc!;}rSQh T(dT)4FHu%9NJե*.CU1%Ԯ8 U⟩+@8EQ5n a 4Vg򢄺O|t?UԃC:sTJdX!)57/ys*IaM|"Bv~Cyz(#DK }_dxm q=C7M j5GFrF}ɔu3z1b竡r1W+.iFpzk~]2x kS|}1ɫWuL.. H!O{S%G@`/\_o\ AF%K<N0L7mT )T*|õ`t%Z"r?쁌AKK6J4A7 U}`Z |'- Jǻr. 'X|+#ǽvͦnWg͏ H,J^[bsccBwc mE%79K,Jsǟwo+ byc-?ٸ)?"F E!cM"GknM  r+I*9[lxoYQ%Յ lCVPz>ް5H >fwc8֦qv92r[;\8ied+vj>CwբNZ_螠l~4h/]֛Vf%f$aY]/9q ^IaK62_gIB:lBX4u,ҚbJ51U"VrNҘBײam9;<d8v6}?>Ջ!C@Fѓ#UY^۪;C;vez4 m FO.ҌAgmgzh&M"NJ57HGr3hb;ӣUCfkw&/Jۮ1wlvE/D%!Rq}4[Ou;Q̝"eFE.3wUKP&9j_|W) BsхO>۩*Qm̵q['@ֳ )W=w`ԶǾ<K_g s\ rcoz{<$`ާoiWV={$ ,^Jj&#`$uܤJX-Ðf8a2"wM{؈]IK0cU7t<0tiA)i"LOp@OqBZ)ٰC1"11{tUEAPd>>} L!{2`)`<:h%"ǒV_gX5QDͽ5hݕ3ã# vi$Y{WBDX$(3X8R n_vUqc½>DI< HN5B4зP8 EWFS`/73N@C q{K=ne]e˼+ ,?ΆrqDve<>K-aSdZ_͞VuXϧr.Gz/c1-aian=y_[ ZRbGX#x6ns78=Zr_ 5!X BShEh/v6*&u $$Irxk'gZU=C*YyAmW6Egs,Mh>33>b[C!PRgzuk\'-9%ȕsɠSV+X^Ɇb'P_8(xdNgL^^|:,1xi> #L:ɭ6zHQ: y;XM N/;vZ4`4aK@¥僁 US)"lZ H1H=P#[^i .2xД%Rzq"uLq/hG?FK̉45d7i 1"{yq:Y *tJ_Nٻ޶r$W,ɼɪ~XL24zvf^75Ѯc{e9ݽ[d[YNĒL*օdY@"4P`TPW1i@&S-dd/PU4ܻMGmmDdb4Beb/P&g)zIBd;)Q O{n:|6)I*'*JfoEBڙ&yB\F f>>v&u!}@z_-j/[Eq21F钖<5tE@@,7TQK 77w O2I >ΆA>@4ȳfCO?ƩפC7a=wBQ?jl[UbM0mT#,u+yr"ATjE 0 (-0I0JTjCR9_uUj{[|6P}bR'oJLl`m%`-{#O8$Đ-jU$TNp!%TT2dEi:AɮA?ʆZIqG5f rC7ȟ䞿R7~<;y<靟~)7<ϻqhY=!C 9E/2<7V{{x7G}(Cxx߭׳ xM*A3Ϊpؼ0H#;>kp5jMqGX8gQg K0R˭4{lVM{G ~=H,i̟>7 x7kh: 
m@wyoߎQ8wqy`Xv/h'vMʗX;osl:zl,NƗʑ9j$3rtFF/g_mV83dZn/2.ʳ5s~';P b)|O27j{t ƨwgZOGl0ŝ27&WgBfk9t*o2n^]\~q\˥ao{N 8(^1<}=L.~7R :˹lx񆧶L`,& [=mU[Z/+{g+0RcG6%c;d_bͳbˁWe6G9$Qŕ|=/!?\3'8kVy Ŗp fzSb2ҾAva#(fX#M.C 5;))X*dUbvD jf4mEϋg׷zQ}jXL+Ap՞~ `}<r)F.:M|OCbU[M8\1ɗRK5H-i /ujvṱv9ktǕ 5YA]1i4SZimkojӻ' W_>DN yhrqY8XިG!َVҙqMɾ&'ҳ_aKo΃alf'g_y˫vxa'4=9t,(HdD\ibqcdc_nI>o}rnn'HYm8B:ǩHnmbSajx.J9iQ ']> {hhW8 ljvIz0~F&˩wZĉ#=mr(e[]Sx"MPۗ:u:eIB(i JzL$)hoe DPE+ЃKqK1: T{ϕwT s Ln? ' ldL.&w'Dc5WJ[ub:>-);cP,\-}sm}F~zF(JeSզś6}[? &kä1ruLx{#Im\zT5χ<]bt$.S`/vaTJYm)u )u#ꍼGNi4)9Q=9F7LТVMD͗giQ'0ᤁ`9&MMZ ?MzK6i }ؼ~FLaH$ gsYSU#f!w[!t3a )-TQ~ CVJlVrѕ7Dbۡ,c_=G8jj^(߱v}] 疧 ϴ<ؚY8V ;d(+-Q_î /A`1>DV!o=ꄨu (E)|0@B0.JDͱ`& "$2 +QF`<촳y Z(F)a3RRXLʑ_yBl#z֨sThAy-,CT螭Sg_lBwm7B$%[ZD?N4 e{Nn1g: Yū!C ^4k`%Mf +ZW"ZVp2`)H )E.Qn':`o.-Ɔ^a6nczȍ_`IY$!:/'kS 4oK7ރI=V/3gװut,V*AkE3}I̱ysGtEdEy<juEOAW?|ةYIqV߼][# Co w!:URARK/Kۧu5әBSAӶk.^G\÷^G2_=Y"T֟tiK.S:&#Gs2u Qxb4 tT?^xC)JWo~ŏ2)!ZOI@ܸWpH(%劦7<{޺笠l۵I`ɰعC SAr(n 9鐼k%:"Nظ+z@X|Ee,_ո|E Xtp .srs]n]KL|c˛봻w[J]韴qFNqQeK9]Xmaoj03"~vD7纈~h>QQR p@kDS@?So>UT?N>URUN j9kKe"x^/yT R&9IDDƱSJKEMi 'qL)gFHg6v$;av g' e"-,R4B'.c%H*=1AP8y}?8g_ NIk''u)qJ5{r{(1D%^GY jI=PJp:ZAX\ Z,򒤳'Q2.ꒀsJ=zI+3AUSe&~;ߖׇIgd*n}/\^&ahf~O{*S@>,74_2卽mÂ'$w߿>o}utM--?'Uꟙw33K~퓀[dK0%9GUݟ?rBYx?e?-*3 C֤ևP/9/KQ=hNe.,2·c*e4ENPBK/i.ג=W%˃VELX:Y!@t=R MX˘=aqTr69Ι$'גFrE+YL)Eˢ iG;YK8>/s ʈ}4h^/+:qW+04+1 v 2L\{%=ڞ8ADmdw);-p:I*ң' ?}$&<,Aei ($3 vhxCL{:oa7ڀ4Iht;-Ԯ N%uy?ٞ{-1 )ʧg(g/N#f1S?dt$g ^:A9](E*4gSN8=ԩ?%L&GQt4ͺrQn<L8],F=i9q4!qG۩ }Se=>frT"0!p4dՠ0=n'as@ >Ag=ڥqD % _Hf-n[k3]VFJiV[JR@xFmiɄ5!/-[ @Bt8c Z A{Tsh#ДN~t˖[ڶB56 FT bk2r z" *B1UBk0ny&ZLRjc@Nr<_}lpuDqx=@- T,w RI1%N݀LŀQUu0ؒZQ1hW7~4MŇ.SR35QeI*ie-йMǛQ]ZAu;, m(sZmX>!s. .z*FES+KIĵHX?+ j5s3k@c&ИiM|y9x>5MEKIrR֋" X-;`9m5H m>}~ 1+ms,98D9AA^\.Gw}˿gC\*p|{hLXް@g4P4Ak"iX2c0GX4՗1p|#he%9 :(qz/;z#AQ(".PMLH ZEf3<Uz8X9]j&P!5҃b }ѥ^G"G"#;)$a]$]ԻGWIe2UP\#h~Grhڝ2oũ*jM*jnD^F0ΜNGʸ^췫+4:MIj-{I3"S*v%C~O~~s?=y _󛳷o#4$OT.n.ޜ/.>ߥ u(̵ȑ' EUI{=G&h(Q(niBE =6o{R؟lX9"3o4. 
&qg -mgڝF/QI9PiW?W`5ùE-KAե޶LtЎnmen҇.QՍ L{F=aTeo=%_ &c9ӣAYT0 0.zӖ\yoSR_gpmjso޾q>Q3:eCs/.iJz%Jy+6x/\жQȢ2RO&rMO8k9ߚ fZ$WJ9Ii LGb7UZW7,q~zdJ 5R^vɕ2yT0Q'ϼfoeP60G=;{\3F7{ԯWPY)EAH*V;D5)m^̓4ZJ=xEZ80iCHXeOLn`X 60=]Bţ[ 4B?aQcDfKT^`L驠\*_Bc-h({_&} @"_x[uzfwAe-(_/Á5>b:;Lv N~ʹ_Ǐ.OB~BWg[2ńS:€/e ^Y yH N dVFQ6͉NP993Tă!k8cw1P2M"Ȩ$%[&,UA DXjjl ~77Md"-NSǚfxHjFd9a^k ѷPJF:. L k'}[엙=7euіT]IRtYvY'pC$g(z*ØvI7Xrr52O KIc&5:d2O33jy0#u_ |7Ҭ5xW;'WC{[ݻ^/< };jwhGoc-JmUofBP0fzEb_uO-,f mJ(A4+<5!v|&w2h]̋iJ u7oyAS t韇R$ϥmʑjLfVݙ{Fm1e**ϼC=vv *[ȯ搱F~gן,k=hhG\Xe,o>QFC=LK'Tyllqî5Pod >z3li'*}Ϧ|W'&Yޕ0:Zq?^Wfrӓ𤬻r3+LϮ&r87/ ֓] Zh:@}ьSnڏ 9Q8@}oJFhOۖԶ$ @ƯFjpulɈE.d( 17sh)JLiJ9S,gt@1`O~jhO=}!_ 4PfmBKD+b !] uG97L),ct!5zQZ T]%Õ JhR3%!ԙANRgf* SL:#;;)ɨ CDɳ5QL)UT\*j&Fխ/Cde\X'%؅M\f+m{|]G`X- @(4eQj GCg7Ҹ_gִiq޾D]FRx*պNyByPP7}:HŚpvkJz@j_`*  RY'(pEHH(Y`RB!G /(]Li)TThN0) ) 2N(]Ix|c=6 y~YJ!x82 ±2pí ^ 4Nl9Ϲ, ׷/I?{WɍJAO30y򢑼ZU}#tPU%UfyI%gg 2 ,G9ZAW;[WJ̊VC4Qvk"<c$+i I H0,Yn ! ͊z>qZ_s@ ;p۴f9$( (K! q껃G|ƒ{EPVc`J՜BIE@냡X ŐgI%T u9R"oOs gߖm!c |Y7SKV#N`yHM%D "s0F -\v*4qu-H$VzcCLLVR<6!9oY&Bzk.!# U"DsLC qN?Z]o6'Pl rzׯzuɺL>YK]Fjwj/ *AY"( ޷V'vn[]D'I3m_t)Z@}5Gyףđ8uNiB!Hu)J)h3(M,һ]'ϥ@ H?z3BaA['>a#:G~NdLÏ㗬 o(Ȼ`ˁ=L~֎ @އq3O7;F`|^UJd6\^ѳMݕpr\_^?~~&lΟ@hNk-7[22fUEK-Wo/`o{LǝeㆵK OWlxw+}>LuvN A_X5*؈헏Y74^ Xks[7hdɒ|'On~~ ]-z5?_PQbc\w=vJ"]*fp߬nD>0b=∢N7wvÛG 7w5/g1f1^.性ݣn!Z.=u.βCF51#{po_5-Πya&XvknKţKj\zί˺{^QXgwr *Wc"} ,ZzQL3 ~e;v>fyse<{DX'[T%ywKeo ww,ow?< O|i d^4y0!&#Fdzݏv5`[{?%

ƽ ģ0PvCL{-D&L UQQw_2o&B NK}tLgU *KN#G9f{2\㮯NQyh=]P` RWxߪǷn{%AC%MeWi|P9~{(dYkԭyKLrϦ9tx)i2Mr2<=DaD\r ]3\xicZ$O.x ;x0Q`lPqoB%ܑ*0ì W^r)0#l8Ϙ/4nBh(AWVa/{,9ֺ7;ӊ\q]aU ATg$ j[e kaIW*JX+uUTlD!rĴXZ!) l9HBE LlP>J3ZaPzG&cmm%1!#* 0nI'ZO[(ʸ[4' h4K^X#S/yUJ/>7QR4_zA_?)?_q*Cpp B: D́R D@XnNm0ݪVy\ V_U(؛ f(MBFrHS8s]\y* Zy(6$ f?w*4Xf8xcL3ւ6ĝW8B0ƈbi0s箽F1XM`f2.Ò>QcŴ9 2 --f!xE#Ql0%jN@1PW30dp JAЁ`7lo?<e)JωSĘy3\71``ns(j7yﺦ~Ѩc$c#^z"A:m#nvJN*z6c)u[u'ӡ{9y/3/o415yK]m5I&uL˭!;@7֎S䃡Hڈ{NBzZ7 ivEMB&q#yLF|9I6`ҲsdVI*[εy-;2;yR:31[4'Ue;~g1׾7|cq1~]師Sͻ'͉{o7 }vL {frM ^E^$aɍ&[Utd MyN}c([c6GlArrtbwJ;GF,u΁P@/|s{E,O̓CLkF\sn:d"zǶ(8ɛ =rp3PZ[- ~qwH_m9 2ݥm.ԪkmhD(q#:v ,j&BG-qh5ǏZ#̞\'2NcJ#uIk}Y8o=jIn7{ T_*hq'TUN(8KALiKQQP qWPrN#.ׇ`r)%+ԡbMhzɯ~_ڲSWw1%ky5ڕ-):/R@̃*8IPkx~LtwfhˆRQh4"p_)і#RJ}AWKK1&$0Txvp_\ >tCPj&_uؖ".\U^cT 5?*1$P^c.2^*ZRLA9id] ,DF264H%BKNJGt&AFm3LJyrƺ*qd܉q%|s_AH^-/N(FCs܅}/ = &H~S2 [ O-##i'=z I3m5NǏZ0ɟCQDQ'wXB|}o՗~@<"~j<29% ZGp~T88 gnCdc#=ᡗ_ET.7GB޸&磯uc$cn4onկ9U7<Ѧ-qM)F_mǺ&MbDtRƺ WT%Ou!!o\DSd 5z֍QXT N7XjLgݒ'Ժ吐7.I2UMk]릣CZD N7ZYj֜F!έ[D[ E4I4}oǺix[*MD'mpܚӨv-yM[ y"$S >+`׺ &`R1h":hc݆92=nmjrH>{hۺ([*MD'mpa5 fݒ'Ժ吐7.I2kuSr;T N7X3G Xfݒ'Ժ吐7.I2ul73? QY2?3hL%Z`^K`VM&9F1.e%,& }D(fd죜@DS KJTɡ 0KQP2kFhj4a8/ 0SBhifjN*Y˩ O2ƙ*Y˫ ExX"k%CBqtZVM;Y+ `dMH)KdDֲjBdz5x.Y˩ P2%S@\dbQZVMdz5 K(Y˩ B >Y+ Q)*Y˪ Xh#TZF`1gVvE=梖[1]qyV"k95ԐE8Y+@[c Y{ 5擋qKdDrhhr5t5Kd-D;lr5Ω%V"kY5):Y+ 0q爔Zvlހ5.%V"kY5A F%V"k44)N粝y®Qb/f׏xb6X+S{G̼B ͙jx{ǖ.JUU%=7+#E'N }QZZ"}T 0|RO%@8y*TGJ~?^]]?˚z? 
ryж_OK @zUA?,N>T ~)-ߧZ=cCRUV[OJ 9vTYqp8S`!m!}'({7LB0^w鰦FV+ B 6xldLrEeA>Wasea "F=5Rp4dtE!ԯ@ 92s%W9=rb<€r8@4a(~W~v1E羪݃3Yze@s >B+!<6gj|aN>} /Y6yݷ} p`TW ?`<1Je \5Us-ufQcNA&Ui~VL!`h[IdϥJpkl(uFi%!ٮ;B NI+6]3RIQ렜6` Pn޵q$e/\v~?-.uaS‡pU)jDQ Cq,ætWj^s Qz&,?2cwuy3"~ͺqQ^wd`9ɿo@NW}J`hɻ F̉dd|32j<9;=Dv3%: 7J>^6v7+ /x 8%wA=_cC%o~+ȻkZ|Xn1K-h`#ӜS ;b('Qƒ3@v~γ#&dNv:XpŖ Q݌'LS*n5H|a'u 6}HbxЁ!A*j +qA!)Xhc^K4Zro5Pl#?{U-WS8VKv2դ *|4cK1O7h]<titlgA>X9Ŀ+h?'j+ÓZK$ۜ%=*s+J]񹃏P0qXFbHS}D#- :cg-;XTt2 lЁܛn`IA::d4 :Ņ:^Fɯk̜azocM^!& ?6s4yed?SBZlA5`>v Kͤg`3*h/Ma_̗qrZzw_dtpu)&^f` U&>!ظU ~9 ]/ݮNA`4+f5(ٙ.i M-~Mh8W~E(;X90rǓՇ~IH첩;Na"J[H<9f (51Jre''UeߤT6 Wɧgj)?B7";X`y(V˪ц%xY%z-esAYz"V;~GdFh)E#u륐5^RK% ƬĬ\%/-<+Ǒx *c[&:jdmJE}Dlj ƥK^:ʷHCK k'Uvgdݙ7!\…qᒩ8%UViu [27x1 %Mq/YvmI0wx67"`2X\c(eK0]|;jJUgCp)}WJ=1KNr!V! QNҸ8{:U68N"s&x4Ƨ_(88z%6.  &"`Ԙ:tcSսJ9Ǵ/bggVU7,7>ny&]I|1*+(T=:8[l8غЛBIH&6Zl:R!D[ lHUтB4i]jUdnv&E$F+(,Q)4m9XzZ@J8b<$ӊF+Z^5/]j\WUPlbpTxgdB+Zϧ`BV9)'v:Rː xAb 5yH@|J%\U+,$B-xwrsxhRݴWy'y] ?͹*q/vxsIQqBGe9?REH~O1ξG'khkQD;LK;]~ϻe||Oh㧷*͞ЧUy޽ᗋwV7y*ߩA EP ^oҬaz @KoYE";H"aU:K՚:E5;g9r8{vKv#l]D"bD%RGX_( )IX^#,A!D괇@EN8.cV("#vϗ>j u^c Hq57m5A :ͨGԣRTkgX:(QƑG--g6,"U2F)H%(u6Ҍ#Li8;i.a0eP>C&}ROdI=5J࿭Lq#EBk=`AD4Tbl8AKoTyV.PTs S=?[>JS[YbmԶb1 c+^gGWG""x18N$&ԙh]8o6(¹ \Hk R%)=B? Gu4K؂-|m ~Ux]觎1ݴѫ S8LQ˼A1DaKP\9 T2T_mAY٥}n[=29sk6t@͉k}qܬ31,;9uf$"\y$qӗ$`Fj1g{)F0JPؑk74T͕Ƭ/gMi&pÜU,ft/>/AAl eF0_ /ގ%:},dD"j'r:7T)xi?{ i 7} 9C*]a;<[+&!Ir\,ˆfi#eaaLE >(e`hxGʹӜo~U kIǗp<#Ӡ&<v9&
0IzHESZf7f8JR s3/fx o(֫䓟OI]5],h -EeI Q=%1<:QK!Ö⢱F0&JM)SII $JnFQ)"6*x#~zD)OpR2 ՑƖlGRzm#fE°}$A;Hդ0cHĔT6N-BT1(%C 9byxYHTRhN-Pa.eόC}<Ā1=ivugA(tލtUGNr~xsw H("/tADbLfߣtV/O4`ޙzz7i}g%|r{DHd=3߅rURnyv+p7#WC/qI xC$Aw]l9d#_֌dbU{$-T,EVٳgWPd# #>pn[ @gk[kYFM0d"s/93^1ŐÛ i}4mO^B4z zf設oXe+jкcî{ !gw}\kt[}!p|Ig |b=+g ٹ=݌!SX^+a.$9Llc*KJE YvQ4NPQdUl,=&;[E>[E@0vIFRh2:7Z4Cvv' 6I c>؏g Z~nGPQX=8~!/ baR_RNӱe1ٶAQ*G ȾWԔd֩|UT@>TH;)-FDɇ+=(0BJ9XtʶsOgٰdvș|Ey!j{3sB X)9_Npn+(}˜07#P|K1sen  ΥfwLXo/gh˻FEntXFEntXted3&;@KZZ}FZxhSDBȁ'hZg&S]NT]Nd&E MyXWVSrtsU)6s 䩰SDJZiۀ޵gKÉB;MٷTj])! $ i0{rr?IKBZL2XjDn&V6*~`IJ~20<6oKԹ "{D\Q>bv X5ܶǤ4>"K-Vo\ūR>bN±g5 d?? Jf!I#R({7RX\R-";ن틑aVRSHּAD0']BAfL@%F j-w1BYgB 5H㖵VBTEr"bzf/taVst2%dŋݑm|IUxk2!=^1]DHhpW1?0&[䰢>5q`g׏^\_]-geJm+1z,? WyW.j,{]LFeQuwG!Y@b;>v5@G7)dn2l%,{!9{'G}x>`̰S'(y2!$6:.I9!ecYIQ a{dF5vZ U9mlmNƭ$$Hё%/_)1ֵ$S/zP m-}^ f%ȴRӨ7bh|o7Xδ-i[dӶȦm5mbQ^z/l 4 h5 i+(:|:E(.Xv @3M)q|%,_~ɢR'R-1fu(N>R볕ֻ$Ձ(4;t b3mg AӨ1BчLƔmq-r΂K}WTlZāX;c%pt_-=ZU/-"KF^CaҢ[YvcLkZ="]xh T%wh˷ύHעpٷ2SrIasZꤛd%'4Gsߖ/=I'ǾzrHR+(% $YXv?X.8[R0cGp~yĦQGfUȖ&J~t;gGeXm]āyg~tuQS9N?2NTxӡh;4âx2"T'{@C4ub;N iǜ =8'z0|IK l!v.1J^JJ*l5ΪU-MZA0FF}p⏭,R$>RR<~ˌ&TtYKNi8dL6bm=X֒Y_%M=s߆.e~ ]@9:W!UN8EdwkPmF:{,Iߏ<W;2r/8d }MYVNfKT9s]J|S⋜_B[H yȯ ,iSA12cQKMUŽеUBNd E`6I#P KQ, ߄r8ٳ۫, a|iH`qxm7jyNKE_0yb$Kad\@rVI֐a.E{+a6{$ hۄbڐfV{voc$2L:YBMMӊ4fsd8Vl4Ėp1ͮy_0wٽX`|5{v5kMΞ=6brE4̀RlDtJzlրT4КlCNCLzzL [~Fy)lTi :Nlx[ln&l^Z׈M"oB#Q ag9BVa#e͡,* eNhE64 ! S;2 @Fzm-[57M2N+1wg|C:7 3ĽvN&wغO'?]9%UL%9SXhIGT#C :k+iݑZюZ£\`7W^%`6f65 QϚeww쇋!ssqßILR<-HE>BP*;6HCz{B#WE4_FGvy? 
ꗓj(Cb:{zUpOOeBrrdHMg peG/('lo}nG|vy2jEpuG0}9#&F%ѧ'$˜bpTL,IBgҽ'|9׀Jy˞vs?G/>;"d)sb1_}v~_o֢tPᾚ {rڸQŨ*?o]HschݍP:W/:/BX0{0"/8}B޿YtZYL㛫esguɃY?=,(߬ÇDt^ȏmśfE,}a~,]ߟ|eu`~k,\o,D%'Z:IC AT`\ \ԙ *ṡ1ҡ俞ZBoo7_\0W;_]忶fٿ>v.GPtB|D]A&Ig%y%KKj(BGt2gv7s]%t`d_¯ìZCyj*q(GyuM\^+G1i:cdqةSo!6653B6Ĭ͂Gcl f9]Ug_~v y7ό)tTwMc_o[CZ9Sw;-~jh=={9yhhiᮥo.=*J3WSf O$>OM;yACՉ^<<##Do=ᴤ š8CS: 't܌r8/c"fUcq[ƆW SsuGAL'>2&|7I;_d_d_d_tm (I8 V(5JHM`tg[kH<;)0_dږ?Q9Qݓ?Y֍MqS/iJo‹ٵ Db/Yn֪뻿s'G33ov[­΍UTvU _sֽhH$4}J)u~/k{n۸ /m&]j/goCt&3ݾ$,ŚhdgAI$ /B8~4[$qUDD󎃆L( -O@JiB+yBMrt<^`A[:XeN5ӮIFi¥i4Zk>c2aQN!Q SU"PM8$ЬP)4%_“ߏw 1~`b(^:b]bA?[+0H0B_p_>z7A BI9U-E}7"Eƪ tvt[e/0hQ4uoõAa /Ds1^5P@v ׳~*kJQFԓϡ5G^=ErMoޒv'D)\nY(Rn.!-V_BZJX[:T[h{B0p+n ޯs UpyXq 5UQo=mW|QCb52Thʛ@B_ߜyWRo0bJ~gSkUXNCy\{&yVcZIeA7 Q* Ԥ~IIl 'g rrfr痋|sdW] r)*K %1APyR9-Ee[cޅ߬zм Ț5d˦3y310V-&gn)-z*t}Bg \_o~|h]NI@R F@;'8g s d9kMΘ>0ݴ//ftY,؍-o@D{k=8Ki%&MמZ6 D?rP%̵͜&WHbNS'ZQ™8ɤdSJjKNb&Wpd2:3ԔdASθw'MyڊK j EÝO* cpY:iRV` |!ycb,LTȦ4PAʪ'#G3c7Qmu-mW}2R>SZIN -ʋh6W9b(#:qs$8JY!ѩ$f8n;N`!,%A_m(`Y̔#R$Mr ݺ׍Z1"&麦$% 奟DJP$Ii">=6i#zM,{Rj2m.Eu'Їx.xR nzv:~[9ZnJP#Uuv@ 8xw1 DBG&}pV!%`  I2i\d-֪9v j{LKږXT\4medSF[?y9ܼ5y; v-(&qÊ2ܴ_&f {'t2YޘޚpGєQ &EI._;ɧMv]z׾󪙥U M}>钏wga|SND=vEruι~ yi<>+U5Vj>yAY - éxw$eK˽JYxlPOj }f+;/E@mVy}Ռ^."rpP1n ~^{6^A-Oro3B:; lh.e.X/P.S\-:J~_^ڴǪ>].(ҿD_MBK<u1Ƀ:AƘL#<"cɿ-b{}]Vs)V2szIP+pH8/9Fe;X4bRH r]~NPLMꋭ4K.M1()ɴM΄8 2eRpz/DUǶȓO>2zW<gxG>.{$$ !yPIҏ#e 9%N'Tۊ_Y U~u`ڈg\״Z tOR;61(3ٚiQmC8ӷnSJocSr2/7i08F-퍳i37U[찹Qhhhm\aewe>wf\[+c~Հ܍b\!9<^j/jZ˹W[htbГb5Pp[gݪd/k4Oa(+3k9OqI8+P K',qtl\B=zR81Y(G|yN3'eԧ2>NMg "2>]3`NK|`{mI2>3xB4qI3n C iЩX%d?}( ۬hӇXgE[LWv_FNkG[#02̣q$@ʍҡʘ9y4RoE{!^^E|UP+_ iM?oNoh6hg~n 4HA4 R1(Lne{;'F4&PJՕ3 S[ؘ@=z4#:}yѸ*L8 (? EK\ R LDƆgJk#@8E @"vȩYɢ솜G%>$)*$0c!&B{K I9͘DC8)'1e:K&{{nBRmۏז`g4-24&=kmHe`φt/$ zbH; T2 CQ2#aOWu鮪>sO*=Ns\L(..T |nlt= .V Nix8h1ܧ= ,5-.N۫J0ʞ:][BHZ%aƞ2!EkGߎr`NJ!E2`+Ŗ`ǂU[ єy.G1-035pK1n:8g{ K,N#b~yTptކiKbT­ZG*̵CIA"Ÿ"B"173L_TX-UIi4 )JqpXAQ*I!h^V-8O,2u$r\^HK 0۔F<5. 
5P#vRZ)sa/L\i0v8TCXk`1hXh.un0.䨸x $6[Cυu)cs `V qQ09B>AQ l Zט7F}A* hp2pxc`#gA7R*d1nJ:ӦV/a%ށ HxZak8 .x M@@#knoY&?]61b1`t'_˲AO'ng)2|rKcP\Y~1Fx _}e]4}䗏;IL<͇C`oXƠqֿNn.. Ap,@ dhFV@#&9$"E I FF0T$lLHK$N` H3m`3ib!BnϿ i "م@5eaiٖtTAHW4,XVaqj#p۲1m(A#p×,zg4c)^(b8D*cPp%)("5-7<ݒ%c "b?&E'P{/B(Njwh_E] f%Oj AX7VB"(Tq?]cQR!jp۲1)8Q1i@䊫&w1~"n lL"Ws E b [h:^FJ+ꢨ'\ \lL aJ?Aw`AǙsYH"߷T0(U\RVaʕu>M81x`6-ۻUCFQ#^i 2^T( 3Y@'ܒT5437^BM../j?6)h(+֡`} DK JRՒ  "B񣻧ЩXi(QNȐ\ QCyowȀ|rߙ=1oj8LiJ)89c-CHے1 %P"ZT Wpa5 @ C7>!n׈Cv7Hq F" R)ߪX1*TK֖)Tmàa-Suh'!_?hƄHɺ7uOV""!GpM5i0XM,91(X4Y:oƣ.:uKSȀݺ5ֿZdްa MU+bP YE'1F,p;Xh1~eǧ.1/ZQU&JĄ7d%˿.;#/&OCB2cfSbBQ;6 `1.090i;v"{?0%ZHf׃),u4o3L!~i[)uh=ۊߛy^yc5&c"$PRc~ Y)$%OQULSA ?WZqضNΖ"-볥)!I[:wؾ5^ַ%Iܪylb糼 dz3rɿaAhi麧+?˖HN#V,zzḑ ϲGRJr^KU XĩBs\9Rd Α#z t/Gyr*"&$-&lp9dA(ëdcjRCf` u0AyS.`g]HN(\)暐@"EB`GYPt_Oɓr5P =0v0?]o$3IKenB 3V;&4]k%I'%VOŁ3>h w-YȊ= `u;UZ aqfmş4c0{ e3Y zi/8y?`fjQʰzHĭ2=vji9W(`J5J5dj5.8>M{ _M|kqSn|& L H57)8 L*f1DqEt=,@Rc]^-V}|/P\`&vL ~xNKwv gqGf%#~oI3NS5 ,. =yS*@A+cWLMd|xeB9W{f!BPc ج9$?ꓬ_Z~ l.I/^&A$AZzLARn*n"B-D:M.[ܕSIl[x)"Ķ}g]vJ.I @G,Mpꝺgv1鶮>>۲cQuK&adC4N$)t|9KZP$InbD^lBxҀc*rbJKH; i Xo4ZV3!e 4[ kV YAx(Ca31j[+K%9u"\bB2Jkb w9c)\@L_=V n'A"SMi#R*(߮UG: թ]J~cRƥXn+uRW& >LBUyXYHZ˂Ũ܅ZQe*#ܓ󇪓 5m_F7Pɮ7/)'"ˋ-$'د9A3"cѼB`x68dyhzW#Ymo BѪ fK'u|ͽ_@4@R Q6҂z-vDm7~6FkOD:٬CN26v4ۅl2/ Qp0[3`tb+F_hI7~U‡D"9{ߑ.I%AxsG1Z<$>32:fYچi:N7s1!ݎYJR( 2_ǻ5@qg&Sht%M6DxyMQXCk^̸Kz7L}L?noT'\2Kki4V)yؾ$ e4O{.2L6d>2 /\:X6IC*guQc 뻥~۔֊.wAHnTjky >Ξ\eZnrnu9ꭘ1>RwNu[~]s|8t'qW4eٮGЍ%l]doȚ[gi7b"Y: #VYoe#CA p`6vakkrFEk/SgTv.;}9\ zk4]"r5fz%*f&g{yĤFXG 䪾yDZ]f`9RE϶|!)9o\)⫈ΖBY2$G!j keM$|7lYx Zӂ9ƺ"5LRW{Fuݛ`#.%TvNAf.PD(u0# By=\2|\*؞4Q 9I"mF^XlM+(pVzzPVX@]mqJhmĝ  &LVbm^\Bs>nۛyeԀWt@P=9Wi4p Km'h#Ѕ1F$ntF 4h~_[Sr$iZF󲾗G')Hjˍ_rL_mZHZ4׸ sgT*)k'6s mpaNGF|sܭ:TIlT+EVhkU*FΣw5ŧ'q`Tv-|;V.@&CyDy 67EiWIh^GC?H?AA/4Z:* vxr(Z`;A( dBH45q6 I D3Aa} ]׆A^sыՠZsZ.ʤ ª%K]BB0o$#ZIo'PB躋ƟSăCY4bWZkNZ7N~⚔%ZXi)ԭ(VZJh!邚%D(܆edJ5.\+Dkwq8M3 045g$26 el| JS!f̧IW @6Y<#/yj5S9D m酽GV<]U߇G&EȞ>΋eoQ9ǒ#cC<ؗ<6gB|7Y%Ս/$0i5+[b 
*Dq^KQJA.P3ՊxSNh%&4w+͒*٢Yz#ڔ$ez~'#˖XODDbOWjmC";qaZP"-niiϋm]'1OZ=ׅVE龃,|+ff7Icf\YB RS4]`!d`jf86$LJ(y4pPQȹb^p8f?'; ]~XoW_j4]_9mz)|{Cgm1j2VkĈpʌ6 IJoIB2K$1͌a RŠ>(ZS8-`/J]8q61`Aocwp\ӽt{H|՗o@Ja[[ yz%K5͛qb_2> PDDeˏ^6LgKG#nw`Sӷ{w7UH (Jpu72?(H8'R0(0@~ x Maj1G%W\考̢{4Klwo>a]sPIpaˬ4L>$SY.e$HQ86RXt)2Y |<ʼnM$w)[<PP+T\-5=- OmfYxa.FW6xڡv us1N}IlI8Bb'pQ j,E+"O;Oe@,,#vfjWvnc q܍Iދ=}tԯS/Q8e~:7Ǣơa&%i(Ҭ[wO3Knm=Exul 2*WѩxCխWK)H3eq$Q رIZk'>w-zB,64 be$:TȈA NTJ Cz%`ZD 6)%p{?x:Y -rݽ ()#2 a$LǩP`'D KfTL'`#qw__S{@P,~yzXTʘl-ޚY'3[kP I1.f jfkm }3c0,jy^_)cn@S#f¡`u ͣcbBȦfP$#B 1i"3l 1jdYCk=vӵ;Πʣ48 +2b+#$ _\x.MlL5R`=8֓A11*JglٜP-DPWK1ݜrW0xWOzդyfaϧrd왽Z\`F~}}fK%wfl>-fn{oXЦ ̎ =g,y옛eVuF v;{.赣~`aNwu5+6XT?>W;/NvY5>EoE(g;Xx&!G!Y!aL;q0:&ߏy͉}RR Ia Xܧ3!.^ ZSgYT9b"dH@1p6dJ&+ozFČ{QfIpD:HX* '& "v07 f =x:'Ptl7P`ay!2r -Bo`] o҅^^y}7t_x>Q#u~tjѩǢ.H&P ?&pPL\ʘgq"la⤄J3-ƥ40ӑ"Ix͇k'&<.];DvtsGb66 /:}n{ôrvHh\Y|nIǞ/q{PrF֠W^M?Z3~|,V< <{r޹|XFs(~="y=Ol-?0'{;f7}'||0?0f; ߾^ ҃_s8p\).D{DuD.-EL,є &0J_49CDdf!w3wkZ#!0/FuVvs Si9mf Q>ān4#$ W,>OoyeW3}H1bV3I*ѬC^܏FR [& F@iu.X~04[zgQT,cՄ/},ϮGgDDeX5K7?""N(jSn2#FG\abBXM>4 >K|#9i0l!-z0C:^d{qbq O.6=O.8l2bM(Dڥ ʶo}lou-ӝq6r,pZ A ٜGRrI8"sܴE~h8FÈ!m2uȠ:KsSHRM0!cӒS 8%J]';[-^';/ٟV&y[)p h'ݚ',GJX(U2,ż9ooO_ zWn8sp۾ޙ ֥-۾8BwWEuo;vPLPƌi_Z%a8S0͕OK-BwF`p!iKvXv$b\7d5t>8Á2P;^NCl$Myʘ6,$2F X180MpFbÛdʋ`W0EM24X0rǍklgIoѣE>.FMyhQLjQﶛ': Y/c]^<~A;jG*Ǥ2z_ A8y_ARBzhSbL􆥄R2%Cyv5[Q8TCw,HSN0WgJt떽(-k7fD LDeʻ"/UHcg ˊyށ"RőK,)F1-MYje{mBG;/*Y9BW#8b"hwgQ rSl)A|(Ξ>Ξ(jQl쩑yf>-gWgOٳ@oc15 *3k7A"P~][oG+^,_ !A3;F>fqF$ĻSMRjMݤbڂIͮ\ꜪsiN ٕŻ< +=]]tILA< b<wz*e:\vܥmG_TL#.pqw_5 X!o!"FٔJ !"x3*maʄTkE 8V*0Xf\SmёXfk+)j0]jP:Xw`^EUVF?A8PN$*b,m]>mpc VeqО *Do3(s!9i\X$B؃rTXF ڸ,$T_xNhZ;'xNJ iEPA-R^;00g Xq#T3d28Acuf́Φ!#&URK]orBqĦm$R`TӀQ1%sguz$Bp,5PI?Hl-A c)ۺwFeW3TqzŚøb/ Tcޓ#Qdk ] \ T+JJMa *o{Æ'y p`yom;) HƾyLi{|:?cZd'RbܮOWafOh kƒ58(GNAoZ:~тPqc\c[N&qi"Rvh Jڂ@<"X+4a0rDh\ARh9Εh$6ګ4Ƕތ!%cU.Һs3FR,b1Vb1`=nPGJvm>B"x;0 Dc#ir*r?ᶰzM. L~qm?);k2iׄn]8h,jUo_B'm O|l8+#? 
`Wp6܂/jLmc7d}) J; o X ;/J%HFV  pV\@vZ\rodYNٻёalb0 si)6X` Liz?~1!=a E{XI*7,ȅA9q!Mfi0W{J:ւ};=EY-3YqtB"m:O_MOh??.l9Oޮ??j웶&=hաG{V|_Gxxjq,Zm(w*O%>.EfPE(Vei3ڜeUg);?`B m3f\3[^i~w9ߞ@:[Pn/Yqͦ5Һy3L9K4'يw#3%Eq}T}[&[xO'9auZR#G,;@-ʁ@/R6.۽x7^\dCG! "bB*`1 ^ 5u,q(\c:ry,G7fݶ̺znZO\w{~cԔ;s"c >ӍGb+ALY(aPily`2,BEʙD{Fᅵb`. ض9'̹{ጙrJ*=]O_Y+Bc, ܪ1/̦wՍ`UnLʫ*.1&L9s {ul2~՛-xϸ]c97nԜ/NGviZ!"O3(?{f̭,5N*A>ZX{SR'r!_9fTZM2T 8v urhNh!xԛv^hvkCr⩊qvSHWOr1Ha1h:g owݲE)IܭI7^^YK='";}w=x0zpoFW>Wן\ny҅jc0Eid!Y,qLըr4Ke0M~%3 {_c6!^?qS 9S''HI Nh dJSm,'RH2T ntaAGy>BGvHzr.z lƣ(%l%r$X'b{ n3f-ʆ$:;l_¾TUOKuxIg=̧{uPOeSNϿ׵n}C> p -A9%,lXzڞ;A3ֹk>O`G[$qr|wjOFXsͪ9{P47>hn̢gBZ>4l*Nq!8e=9ȫ|o>ps\Ny rvD;A+I~(b=jH:( #Ա 3jNbNbAgWɝl՞MY3}; ֠=kLYdKb;F|J -W$!_9fd9fV:njvr4Ҍr4 X朳aL}Ͱ7wMQ( Ì/P`S{rL%A,Q*=Y^Z.Z9><+]FOv쨷xHrbQC$-^T{ΐXnn݇]Owe“-,[ȒEJXqi;$˓Kv RpT /Q._B I*NuCH)UUKyOg_TP(O8VUZ;'-O&گgkdTɍیpSB:.a/2by>05}X+Cxi\eçnlƴv}۬x\VP q$|lz v]9#l-qHP lL0=P%THLWc3 ^)鏰\ghI3ƅ[3bJvۅKR =$Xo¼/엒(ygd#*^633&9 2E,~~0۾)5j=DvwѯgC_xu_9!%I’sQ3UmFޥIMݪ;sV[p:;`TS2Fm7- w*|D[ϓt6'X-9\ (ny\Y}p z?^w G g-3·kqDW_&x  VE2-a;w:8c@yTbΚQ&b~!^z㼢(cpPv0_%<-Ea%q62'D9i7r!!jHkp |@ob|xԛ_ (O]ݡ+P+9>~5M8M59Jr2 ;|AtsU9RCPИ`nAb\ċ@@d F&1ߖ XmmN ke!-BrIe@4\('1 Ft)둡S5Gbl B)Hbbkӏix;+9JܽtX3u}S zV58o{7Qm'TLbތH+q<'X>; JFN[φU5Ɇo /5EZJ^.ٻ,q"y!pSuh27P V^4FZ'd*T~N?r̷N#s'dAl釸<,To$P(@RNP3=z|hO=͊4+ӶmK3C1B=Ʉ@" i#%Qz.:R\Q4n4S`Sfi^|Pb{Kz qy@2p^+~fGU"e בH''IQEM0h52@M:$i_"=ʤUbT?KZBFYHF9Zt"ݴxC"rɥp F Q3N#Gu#Oz:)%S*GNcYXM 12hHJ]DP0@cK4Wq 6pTXpDNI4 12z55HPM׼ 7NL%Ɠ$YIh-d2j4ZN>V2l#: GDZ{ Rs N=/Dyu *_5U7é5L 5\a_5Ru}8y9 2d9-N7`@2, 8C>RT;xOBos%SamcJ'BkjĘ!WƋއ"I RJm){4DF`8&.g'[xr>D"n3!DS"D7>둍λeW $㻾xT?ҭBԫI_ɾӾ'@lJDRPJ/ӊG!eEK)*pZ%iw3BzGm(+@ }NdE} {<!J` 0|/?M&')?B鱎ڏ\pGYc%{lOLFlFvT] =v' @)=]`屑b`L]^Ui?,T^Sܒ.^S< -tٳW^=+ ~Ղr u7`#g쀓&辊ܩm.$g!Ye;K<# il8+[Gܫ$:7/8ZCV*1a*4_ |sھiћĕޡHWN +Yt11DqڍZEn|k R\~\T|G鎈ZB8,&4lO?m|gŕ1yMbE$fHGU:yE-*M"m,ڍv%(Sy)r;/ DqQMHA 'G;< x6y=~w!?A NpGƀiIuZV%jgՁ蛬O{V-G)!lA",)_ĻLq<` u}z-q Zϒ:0g'r$ *r4;25/ rdk&,wvR^@l13brF.9A}di3+p9 &)f0Dch;5ҮYR5b'\ 
qN56ǵ{$&1aDSvaV;xyѵÅK@N)YW_HΌ4-!ƱHZv%lӏ1Ny@޼r})4:{NƜ&|Fײ/1[OFߞ߽?}㣿f+,Ӟ7+,{.Fέ\\cDrT!vF^.op..0JS1s >:-#vUg=v:dOH&Gls%+=#xm̏[~9*apϭW0SMf m@$Z+'$E|26&P1ETft։1ˏ/")xʏӭ=)/ '詣B YQ"-C`H:#P*UICe,R67vG d*)Sd.IIh HMG^>hR{{6~s=i?9D`(؆'$K"}eVK ќ*.bMHAAJKn7D&S)\Vڮ) AgZ!TG'Qtp Cw$94rTJ"ĭ"ŰRb+)Bmї->nnw,]-d{U\N2ࢼt⨤C-C~CI6Y;ۗ4 yOo͵u?ɉ)<5'~s2xo=_ݝ>L\يvӔVKѩݣvcy*`ڭR Etk@1|0\~]AbA>zKemSDC~g1]EHO5B!0%1CFv R ur=*v#ʑKVT\I LOYRBR^>-tyrX'΍ЭρǬSaco7lƬ2-Ińq ^]j/#D ~NE0,$r.}uxZB {2SӝATBu=09G'uhSdr|yЅlMfU5S_msWθ(;D Z9p6YK.8ml" -(-@=BFE]$m ڸj>_@ @|ٸ1ߍJqϖ1Rѻo3h4q1NzP bȖPRXAD$I!I IMBN9ǝ>SԒ {NwWz2XmyR=` w p-c4 nZ±45ą6"h*Dh r7 <:; DJHzICͨT3vx 愱_\.Q⽞,:*$-HAi~W4si7wUHPr/C( =~`|;>$\)%=xYLދoߠ/n'ừ˲@s)aI5P,_2r+T9{dJIEtRj.h^kQ cʞ%~(t0`IKH%P*i<5,B͒nIuSҍ=F=-!-JGlzҥM8ZҦT;C܅M#w1UiGǟ\ p)够Q<֕>v;8v 2ٲTT>7#FzSԿ^ޟ wsߞv~z<嶞b ]ŭhY6\}zW7/ېH8w5$LU\} f@ F^M !ownAffuo_~Sdev +-0\R媚D>#_60j >V"-6?@xz O3RV;#uOd蕔ގAjv]O~t qŬu;}is]zteg )"8# )4q&ebņw̷ִ݆uZ6<jXco<1MQ=~Mc]J;G#i/ &2%[΢ؙ܍ДڻݑdڟK27&Ye*,Ц Ԍ>8LxvYהTEq2JCԍE}=HDQq㡟u5mk->{cX0gBq]0챠9\uǫG_XA^5b /ϗF$7]!X Df<Ś|XHzUPg|(|G/ku׆PNGI1r v %VOErUG1T5޴oZAcf!$St㎚ ֿ܇P_ղq,vЁ3jTz*lr9Úo?z(+_l{y%NM!kf"@8xWax|O iQٿصsa|<-|ϳ٦~< \Rm{>eAX0q(tJ枱_^.RPp7EoB.'{7U^M}{3_=pTDe3?ðcX8>x.r~=~tèGaDVvfo8KU稞ܫ]ZQD罾1_le r WzG~*ЌJO>B C3XSQ؈]jShN_m_QF"M7e S bh ڝkw:'?54cOq&;KJ7퟿ZkQ^ݸPZwYaNZ{]ZNkJ͡Rku5 ]鼸ZgjG򶪃t]_:shMОbX)uhV:SxlUաC﫮~>-SߩVÛ&/rw8,+ PC* eknX3P ;vv6``mU5]hcпP})c@wP9gս:Ŵ>piBCP+sj<԰ZHL%FAPN_p f$`$!Y_ JPsj8ࡆ_ %R%c9WHSi~0e0:5fg1?߯_gWr<4O? }pwcTn7 n r6nxx]0ĢkʃvBF'+BmD^{˷dIVyzwTԟbRk{Eo5xoDxI*huS" B0k/'RY%%$L"_l\r"r1BP3#G.bV'#P3bLc`5q`#1J t'ùY{d`FѤ fP3.䓹jL6ΜîfO6nMmöA9lTöe VA6AXn"H\ *\!Q*:`նm_ʲ}b f(8#̷gKS_!#M%7w@gAf 3q82=EwNdBM-Rrܪ&"X)!xT OUj/j۾|kqV(̛!h=zHЂV먠:>[KgSHecr7h.e|0Y[k> hDqTN`I`P ‚wCC` w^wӢDV^z7y~ֺ؀n\)w0 ZAZ|OT&4{_(Y?GyJʠ_W/?<0)x@_2^3+h>C[6q&DF#E(yV~gd'? 
߹> \G[far-!<?s1A\XYS|d+Q)7E>ǵ.)O@c|S.$H%L#`qjEU4Lk8Bl`H7`G- <H@+O8hrc$t̔6+5:KZ(ZTy+QL~|!QDcCŤ)&YquL: زbR5F+S{965(55ѯHRa#+)~礠JE3u.竣]TdHSoS-j8+ /3<+O?P6@IxeVa,7T9 bFf˜4"^xA7F%^ŘKU@X4 18V$II_f- <,$dФ>Z2&p3N(3Ψ S;w?5Âщ?XC[1L!fc( $9CDJp--L4a63ݢח(yx󳍴|اwj5 pCY(FԙH}|}}[x_hB0wNp0ݖYء1o4i@pei'lN0Y-b槎ei)˙J!Ii|CMi|vzl}5'L#9(5F$,Zt&o^*GUy? dL`~-!8WWMaZֳnl}G [_|kڵ}"Ϝe~7܎?Zhg筱u9l{ʮ[¾}wp9OkR;[ s3OPjQK)IET!D U7 +FQr .t M–D !WLh0 ;fґ]7%Ka/'ȋm߄^*~,ęciP%5iHěmx18zw?<Z5J#͓C.d^ߣ ph/?^_;Q6s}pէg gz~￯DFztnxT֛asT[jrhVBzOh\_G31__ p AH~]&Len d+8ZjͅҖDc,.i/$( aQ>Y]siV5/^: nv5L毴Oë> H6kvc/1ks̭a>!/?˖ǀ?S=ys_9~8{ n2H_]6OߖW_ֿ~rgԠ5riG$xꫩ==^jO1N&f7u~A9:h 'ZltNyjE{!෌@Y\>g%"9Zimߎ1:_'uq y| 6Яr Uh˹{A[u x8pqA~)x>_tz6ݧX @ؠCV7XْA5Hg.+I87˦o/Vp1iA9OL{g[SXDBwq埩B2-T&H|x ͗nu2s3Hdx1~A{1fw78:D?~LV&,n= X6ԔbyZP85<k&ybj Yg8E]8qir.WFwc1ڰA >1P }RY4/Rcdˠ k:tҽb}ϨSrJm4nsz9v/UQ۾rlcPm]h!zib  ضm6Tq;MuҖ"4B2&j}GþsWu6wίf,AR- n=V:n Q?JFZi']>Hu0 ^ݭm 0w^K-]GYk;v 'x % P6]Vڞ1|g%-oQAZnȘ OPXk$X*87DYT!yGpXKZpAD&I!OT/$<^gBH2TV,2.)|{*( =n1.U{ \eu֣i@vEjQZ͝4*0Eg4jr|q #S8@QAlӃ{ܒ51fF: O7[͎#]8I}h#=HM2փgg+"$ImzU,V2.`2%;iq}aؠ մ|~Z&1?qo,>ã/~wWy?-uR0FZjki(CmIԞzyRO?+E'eaZJ7b։IIMOVhuosA}xYg^/{jEoU Ԭ8 hms4ӇoQz޻03ҨVH}4&0ĭ{2Bɻe-'JМB$)붐Vض Z^X bEu+ ҾֲGns0H&`ˀ&x# V2C/ϒoUT# L k;NI%zWK:Fy:i}V:#&1|ix4^,\C)Tqmfʒg ] sdh@qF(yOxz5^%VO3`d"A@L2hp`Q=3Ɔt5i@oF0X?h{qozROA`cC;C?)9କ5Ocl蝤;NGa~J1upUK׬޿dE\fD(juCLp {KFl른Ac޲["V)!s{ۤWn.7\ힻ]ۛ@{dK5O8 3Uze1^(91BH#$BX(e@ V ɤDnն:MV3Ȟ16_Gʄ'W:;X8XO`jiV DׁFk)hggpc H˜VUFWaVa6@/+ \b-.gWXW;qvyfGn luWZr!ŭ r)z'v̭[o8T.2\b DeQFx(A8-/+vK\9s'plM7ҸgS.gM1]SLS&Lag q$pf<OP1j_kW;!У$e~nin$>95;K;K;?| Y4nC%ؠHJjIV7B@g .H/&C?NVI4Q8-jm-_Oaqs` ap֞kϠ@CKD#g/<W{` lRaa{},~9yżʨF7τfTk#I]dCәgkߗ=Eڛfer(bFW4 tp͚>µ֋r2}CȍWw B_Q栫+F.aVyIɔ>|}{ߴ^ 0ں,BK'td+ rW_r!em 󪿖Y~])A ]2gN4Rdv(jz.g*( IJO8;%-GKƀҜ5Hw9|PÕ H:,jZ+r6KqHmDt/ozҪVxqM8^G>!x/lD1˾ Ka[A}.tշǿqǤm⃄?swvyUo֪khGӿ7z]uOOӧrl0x< u8[N5p 5!e]~M<~_? 
J/n/ϟ^GfŚۢ*5<+N7҅UiNy |p'̷ۑ-JtŊݝg!@)+6LELJQARU>{mEjb¢G\|v 1NS݈R 嫮7i̋bNIAÕ.PbZ܀nP]-:; {Lah^خ(~/_:=jm"JF ,in@NHȞ16O +"&1Br@3\ ӄn=Mg,P|< ZoBޠ2u=WROc<̉->=Ï"Y=J']);juu_9 r;+Yw9b"\I_WW9湑sM7\zww9zn:A 7Y̳(uig ( ݖqD5dNf!ģP5U ׽#cHfGj)P$p81SPcaiI/Hg6Q 4H~1/ qH3:l']h kUh1!0yeok|c+S"yC%ѬU9⠏I,w3wz P<_;(D:KtG5ѮSŷg IvPFZ1sʴ>5$Neϑ i Y(}vSpq:TekJ|]a05{*\^D~p<3aK$3Xq%`[4ŵjV;[ߥ`+轠XLbB *qK|A]HHő U$!c>K‹E#E+@s2ٖk‹:M7쵩 z[%0'3Uh|4d4C; .j_,cqgd/0U$W= ؈xPZvҤy6)PvOVtM[VN4('2-uS 5،E2K]2̅n,=B::u bd]Ҭ/Oj`mp0XeK ͭQNWN۽&$~1kyo8'w⌈5f^v}kZ]P>$ba3q(@R g@ƀTWے'ǽH|hiPA\çR@seuUʉ=W1j`5d.IhQ~RuڋTtI]9`Ѣu=0UѶmwTi֚9t}HoQtO&JItͥ(ǡYoZ?]tkiNak!ZYjkyI|^OvIeq;Ձm;K,IJ[fRi,P•geh dIX6.`DUr3j$ԔW֔pE "v3(^찶; |CG3Ie\xdwЄͥQ*& w*m? wZ8뵠|=D5Nn_Y2iS Oh*PrK+;MPkAv , JJyǔN00;eBLƹA?';KqE|q60JE4cR8Y f6\y."߳.a7f/׵}yupHwh_ gb+!8KV[k$8=uw4\K+(S ?q- эCUjx8Z3Ws'(VShPtMm\fJ`PM%p`F _jW;4G˽a3omE[[{kWo^ հ~{E4 ▃/P9P4Xѫdu0J($0s;2gDwЕfM xeCL` ƊTPD*Z'H.R0_)"""|>8 CCRv%Ф .c/7J""RI0u_;0f nIlQ'-CI0qJ,A^CM[ߩuz"5MM!5W"2U+G>@'0J=qN14`-Lw1yp*X/n?h3.]ɠ :w'q1"ٯըJ,p\CJ3 ڲjK2B4eCH!3.A2.:b 0!,R5E FJzdA+YY`RƘɇhGŜҸ ~4eŗ%%ԉXX0 UP@!1m2FIk` "Rhƣ JgNj5,U+4kK,sª@PQ@TAá rCqH4N"Vr1 m P>F5-юHэ%# ,Bs*&$2U"τj%)8LRe-,ȲQPk 0,M.|BpE9LĨFbI˔Y_i$W P 5zky $LjmpC1h31u]D'LGixO2ԯa޽Oqqrs|Z:fo|q⻀=?Z&drGޢA*AģQ"߶#KE-;M kχ8lo>}w+Aٕs9NQ0|,FC ?D(?txǙҏ-h|;?u7w /ӯ{{ͳ_mc]7?vlW3g{zϯ_swLq_pw7qIݳOoT+Ƨ/o/ɩ?:vNO!7Ŷ:`̓x0})C.n?/GG+3%&|5)d<׎˽Nube?Eby=/~dS,xb00Q;ǃp_/ҫ!.8y#ݟ㟈ۅ/N#wWʞ۸{WDwF ~az/>*ÎWoyqjxf8l/oO┕> 荽LwMoFv\F[m ~xp ώuswKvϯJ/ǥu θv-NOhMC_goߟB'?9dVolڽVu|?6:dq.4BE z3BwQR[4E:|蜢JZ*~R\hV{Jz_٪ZZ֩-uV>S)`N3ga "@s2͔.:܀JKO#όg-yN*OЁ%yU[u)eVlVlV\:DZ]PZSPlFi|]PThy3j2J}Z.er)xfgGf`:/ n$9mm̰eː۸뻍5oUhFytg'J/xsV17V8"͜ZV9̢=+l%.i;TddTNkQ␸) D !j ֐m ֐m  ʵ(ʖiU16jZKũ5j.mm,Y:=;a\X U!mJXo 64 #2׿i"Qy˩\zh•r}L1t܊*\R$Oy {&R R1q:/ew%zeWC΀U<0ll8զ,:0W13uEb"=Jrf"Z-`\ǣH)e>cIJᔇ:޽۸3yC]^R;k!oF)(O [p pW24fuT%9H<: dOFArMO$iTk*M~B5 BG I0dJU2tZ&(SHg=)ę"y 1LAX ZK DJ*+rXb %!ռeJ:bd ZR/ L ftJWx,d'Vg. L %`/oȂȬ{]E04Z!:B2uQ])S6*LkP83}(_'!KKT2TrJF.b ㎣U1Hʥ!)  KE0wb)Hxl8ӝu6c9˼a {'#bVxjH[T'FU| Tq@N. 
NS;~ZAHe%\| RǠ` ԟҰ(}(*PN\G8H0.Y@})#-GY~ 1monϕ{F5W7%ȯ~fgΎ;]_~k\ P#v{`s%sBZ0S4Ks#Ws Bf7}DE+hL۽u/E@H?wL#\N7oUoHz)$[L2"Dug6] (Fij/PжE\#ey<:Tmq:d%ڃ.h\7Ux\Į8}ؤTZThR&)v*k Ehix%͓Er<3455 37Y{caQ%C..Բx3 tprba,;?4'd, Xߣ Ep6ԲEKõJ/E]{ѧ5E[i %ed/' ԛsƟl. t]nw(I f&JOz7t`ԋ$Mpd\hIgr %qT(R)8 ^}pOD~㸘ƚ_#Sdv>.箼eZb3>Il|z6>l kx<>=xlگV^kN[ׯ8(oOa|G|ÔF^Qť ?_0e.˨F>Wɯea{jYsn*|KA`@)2Ts 2| |?Yo<< 2]@2*)hm0flO*0/U5g60-,L5 9h)hٳ2"WK#y_ TPgT p;N25ýuV"P]Gtu+~2"IO3ykOk9!3͕ WzV>\,2(|Ɉe-P(S@٨uz!(08R4ʄV-vkOnx纤u0ti:Db)KY.* V:kd!r x,0 kg%Uj/o8MS*z>mS)zo7E7W~,k-|G"{I&bcgIL NPfnMU%){P%}ܺ"|gփ׀{ ^]@ h,ًK "*7;zs`>%,PA[07b)ZqxSscfv fro.%<:Q ^BX$r)جY.}8seA&`:R6FAn`蓋 ujȚ e! TGl׎[u ({ْmHyJ>4~i#MbI4%Fb (!Ň+F!=)/c^dLGY2e5О(4햸vJZZI;&Ñhnf-cɔ3ccFw.KR&)TYfT1H[M#mi-F6Ҷm?Y-/F)ǟNׅ6&D{^hف*ޟ];J~/%w,mWJA{ 4ӳS1rع*qX(ىL3#NFYh4Aq%5kPivKim#miHF6Ҷsd . A9˂8y̪G@ULC*$Nlm)X']SyXMXG-1;5)53!l] y+: iE1^g0^w6fyI% 4I깐2ֳ ov;imlF6ʶQm& ۅV*EHIJh3=jecQO!k<3ր{̱8m_#okpk+eG?SGsn?nރW6.FNd4>D˲$O^':н2N#pF6e<#XrŹ[8[8.IIg0 'Q.C5 nWJA{u ×"w=Im!yJGSV"9r2< İTgu<`^~}ж1Mler(F6vׯյ1JRNmo]^') <&\ϋ0L(I9q $j9G91 ԭ8ǯnJzgJә^C lQZlNil2P0 8h*Q(guL~_ݚuԨ2F5*QAM~??7­\KKكH"$y9d"/ HLܣZ? 
xh%tPW8aI}WGBy;9;aJtPfY0Yg Š 9` \;彠R0G5^!uʴ4GMX [4a8&lф-Eh\geԚ, TefJ0,E,L&"!7"da=|.J^+Syjp+5Yd@n߽y;fa0cERJP>H:0Ի(e01&q,(yK\1:@0y̚vfέHh֭a?]u`<a;@RX]AEpߠ|󛟎~|wo*A ^)僿'(ի*>+[]tcSGHg-|?/Uh?tm#W|No ug^?tuy=(')ͨ1 v=^a0U_=U4RkR뛓UA`mWmo6[%&'>@tԎfn 2D]NYY$=u'Wu@  ˭;׮A,%5Z A@͙A'γ];[ɉOC &_i8FLK"^gYʝDVB&.1Fw1uLaSmTBwy}y:+DݥL XGR=Ff`K_N[lNgkX%&{i\1y`p/3pB9  2?}h}WR"-&G^g1%s퟿&= i{ԔJ%^xBJc.ͥ>PK8O"\BK \&-S:.>j|6:lo+7"g;#$թaZ6o|A8U[l6ߓ~UGx!u \k@~0%mZ̨ VRcjNKTrr(/?@nd_ 53JޱipCFem+5Aݴ`Q[Na<}Wi(c-kgYV#qpM/̻Vcf4Y^'*2p8Ycn:u&aIvPȢ~Vc%: =_{[9RAs3?~y_da۝v;:auT"pۃ63@mŽcv` q9O;ӄOkykO5z4e\]ϛ ]3ekRS/๘UxY)mU $;##gco@#0+HwdG8ƨ`!d*7АeTr9*(M,& SHJ{8ㄊd<0[Z3 \?&ytx3/">[@ __o\=5>N3$Z7!jv|%ve1W㿭 e(-I+N>I}Y8H҂x%5R%cdSӚ A{_w0ޡ̊-8]wqn0aU݌gٿ}@r^W?$M]#&+r1Egb[,r!0=!8AmT։m̘ u *3T`P6*v\ ;"e7b[X@ǶVl ;XPLzc~~2gK60 "}#ڇx.[>R)vX,JT@pey l2y )DGQ(|:*Vb%R(& R2dv (%%3z\[)˥)f` >@6|@L>vۻaj3tyidIPt<(8LQ>y90EA8ɈD2ʢl^i̱Wo56f0vqL IFk2zqYID-^ W.DBu ٔS d-ә$,A>3$!*qp7XC]kjW*z!s T0ŕT!ѧr)E)k\t7rcޅ\f,[㋦+i8r|!WzTȃچ';)5Y)}$79_V"BCL:qL1T+B j0҂(PTvj_×)߁/[k09gZ{:_!eag 2%+G&,)ʒR&1ͮӧ]]}Ij-]6zۦDM0Mt4jzks4XtEI wɪ()"Ziq> 㜱EM{g5c'EഝTʝ4'l.$>`:NNŽ\LǸ)g2`!n%r7[MZ IMa|/=K2-w2F\a ɾ1 EM[oJi\Q+{F{j9 ,Dr2C,ӊm+cY*mQ$ ")y߶XXcAE&'OMI(U O)ZDt1dUUU K("ժc; /r/X7-zbi;vNsg`9>* ˞ `gũo'{nFt@!Т){v j TSxD>,go7WzF`8SQ+E'K oZB; :#߇Rn[ ?|%o*M9tRR+):mC-+{mN 3﬙e1S"kTB쒈B\$<9RY^IAv`sk 8G:>|2 "e90#ٝzu|QU)f6@zXQt,B(4,Fqe)xz,~Ne06J봱A^ 5WQ)Dpy;/-lUIJkথ \k9!,έpXmauD8K,uAHkCW5YHXHs(dOds6jńT#_ψem}q¶`ث=VLS3{({PRsU Vd- K XuZKu;/uC)}[v^BP;J8[@=wpP+qAٔXu΄ɱGs^*IKN# e<32w7> Q;EvUb ӊ<(|#N]:A1nQvJK[g8XZ8"jH݂t8TT"sp:Sd0*5(waAG< @@|,Y0Ō{0'JJ 16w5fd!TjtAF@^Zn]q K ⦹rTp9nNN2%qr]tuwRg| rc+<@m] ]5 AG+hKK=L)hQA^$/>k{bD.[>XIEdW۠ʤQBy:MɃ4EC4 6Q@V l_1!AEP.N^a5MµS43Pp0T/6[r7ҽOΦ/9;8i3 [f(?(cPϏkA:C׻Sn{e$ssouY>f3& v])MKjBC)\.>[~LZZmvT@vBTYTRHZeߖz|b=D a3F0G0H@5FL*YBk16*%`45Lf&rCd <>VV%^WdǥHesfFPt j'h?pZy).I]%qm-:Q Fi]%yd"F PUkP(UAn)6Z&/Sn^5<tߗWJSɅ?^PB8s|u|WQ9$T ʝW}\ˊ;w҆EAZ|q<*ѫ^^;"wRy"u<]Z'&.xw OJ ꆨb 'C.X59g v er*co7orn8#tS3'vU ~&,D$39aJ%-nL~Zh O NL7SeǘpS:>H֜$bI홉{6s# 4cPpnSj.7)-F 
_0MW2wstz*s8!>hZh/:̈́iU VӪc / XH>l O(/e=q=iFAxcˢ4Kh-'[n)q 昔E\j.,K"Jg1TKN8 z.aLH&3){:WyarF:zvOׇ{Z/*> ɛ[_Ip25FVӵ~8/"SrV_.,z\_|m~7].mAq5Y9< #N!\t] ;s7B !V ' }3ņgzX")O@ "s4D o26 ȧ>,WuCLXabX#mR뮛X6Tb]׫Y^A1 ;3"GEfyWcG#ٴn3n~25${9vxLq9nT\!"SP$0pG1b ̰+0=X1PI(N)rF+ew8B)5;rCVma}c&Twvp9خ.dCDSHOw&{DXT`znZeVəQkQHQ>m%7n_’P1G W1?u{f=(1\2~6S7blIֲG.jr?UXg<:[f6c\˷!ѳ&'dƴRj]1^۪PG.br^@وHq`l+t,Y1WZ؀aG|z)s+ 7QN~jxm&WkfZv龸./} ?,LW_nA>mp%`+D {ӑi*; 4%D k^%JߚÀ:vXJ6l0:mJ Wa@q^g*; "Iר]J(Rɻ8 Nh\x3@|lrf>.ʢzp;ÝT?F >kŭZNE,3VkZN!zQ= V1BkFweRNܟtyr;4[!js \feںLk#[)unwzU2BYdui[^]x.kxXT{xV TqjN OmV飢X`vyI,%T缐A{∺|D\a9lgv"Yz]{!kwS; 6" J6Gg'j/2*5BRʅo_N'S踇 ZWRF#v=~6Dw~\zbCh`ڸr"/2kI[P&rLx|~Қ#PXC WWk_L/@﯊UXH23W\MEꋣY|n!nkӗm(Ef4_,/LdsE9w|pI.rJ f߆/J ,~~cq]f ^*VT`yjWU WL9s5WdHUƸH6K&.cXd*j4n:xji岼qgv9= dzW:|v7_קO|].S p)@>i/}D)涸"]ϥE8&;v|8ɂ(o94%ڨ@tOҰu H\ SH& U%i RSf`k{ >tu]w0Y!UOw5VxW7_v}G-x)aAJccUn)?zDK̇?on]~b8*Ȣ$)„T73_]/URk0dpUbD9B8a&4=ߴܴ?՟׹E)QWOyٻ޶$W,vŀ 0 Ia*kB]BRq[MMs塨#tW_a\bK*q`F/O2`St ?izF}y.Ssb\c`5rGHX6ѪZ bהQ~Q4@LS>.0Q<-E)ϨiCoG(F 4 iCkc.ϡŵoxmJMqN]rޜH)fR%4&e:usrYUSt 3f!xRG# DBsm@U[U9΄"DV聳j>Dtĉˈ3 D DTn< ^jeWjv o&)xM&Qb5Ychj2K axH=#0ǜLG EPE%&NV&1Y7hnĐǨt KjuG?+&m*Ky &z9׼jlȩA~ !n 7@'™DUٮ_*yઌo}WٮB^\|MeW*zx*i{eCO>wRY*U5Z?އn1ר~ojo+jU{+B9:G#:yB QEO/}M%(U֋lܻ&c`RS1D{ *aqEI"zgVl,4rk$q$;˫ͮ w7 ⫹dRpe;^-RZ-Z5y8WbCw3U9HL ڼY>{8DC,/LU^{2ugӇ%!kᯧy0;hh[lhK0\Ӿ}\ 8`"æF7k*Sǃg=/Ssgd+i4֣wޫ'zS!yFYb±B4C!lt}ӽbOWAOqK 8Пrb&q BZ/ ZMI0̾ )-W2zO*JbiOJ0bǨr1)5ϔa<i!u j m+JEً)i0Lz,׳`Y޻%WKZ(|7Sq>&~yX.<`D kQE6M/wa|ftEN#ane7. N^K뉏,JWa^R岮RKr&ȦzJMQ:iF֣N-Qqݛwk\ֆ6)I'Wnѻ5 tQŻMKQޛwk\ֆ6).n)4iFv %Tkָޭ 9smSIMEM>aMO"=UZYߟ>_zSKՆaS&JK D0zO_%f}xEJe36>e9g˛pT|?bfۛp;HJ!j U}3L1%zI*(U? 
Juv7S@PL[ڡ.TWEu6KCh(*H]١">wH-c[e8Tej&-5Z{}˺5e=QvIMdRv8=FAvn , Ge-?4om֪b((_Y즾sҫ-dʮQ>ާ3z3h!%\/xqqnSR$ś5WU:Duf.$v߮ByZJP"._:~M.hCH; 1Oy,wo~$,Zh5hTA#6L!eMj'퀤š[9.&}t]>t˥rE5NdFkX ʖ_ckOg03(?H#J0?^㶄*;\ܯ"A娴Q!>cdm,(n_SpZsہ5spqr=^ K@˵_~X?f0 aqmݗ?ƻ_.foG7It!`h*rD~Lp?^}(Wv ܿ^" l2&iw}"KkhJ=}{tOJ9&,`Z]&f6~Ep}0KE)H3cIb)ysSrް` Y^1cLptɖyb<.ʀV]ƸIl=6(R#D( z‚$2LRY:)m&oタ' J?$b)~md;o FvL4L3 Ac$sS/(JgUCZJv:5?{O\mdZVnd}dxrMXO*||ɋVed5F6rvB{t -FK *eDQ41JB?EvЮBTX+ݧMhWЖ1zBxZ8 d:i(P_=Y@r(L๤vL؟рPt{X1 czh +Ǧrذ$2t8!F0ccQHGPH*6 *4FN 72<5v,IB2RʼnXp#BXoi1*Olw3%ybX\f >,bgşwvNCBHPMc" bHqU!l%^u,7̆d8 |58!֒/6[9ČqҩYCdΨ$ͨW{wq-nΪdl @vx_a̦B9Ǽ'w&PuҦ┌ꠕ,$`ΩB1d ɈI޾,V%O.yZI6Ԓ8- 񶨈"8Ӟ6^h;U󘙣@tBA)wxٟ4⃨x^S{mܞ<1iu C0w@1>j3)My3#!f7 Aʛ#Gw)i}AsIVҸLj%i+}P,&aH5[eykkg5'[1TXDXN*[1tLErV ¢ Vc'.64~4֎MsoL+g(/X&-y=YXlBpcpS3߯&8*Ps{'_}Cv=JC:-}/*[=' wrwu0un_ =3+l\ˑo;+e^t`w}As%E Z՗aKĥTCۛ}1+n25M`\KFf-3 =}q >}͗_ʋ Sm:}tH_T ߄YTkH`fI &W_bA F~=],ftM Gtk$!XfŠX[0HgIEiz(ӣds?m8R*'pMJV:`n<;90k\$;9X;փS%weBQ,qИ٘ZYd#jk#TX Z?{WƑ OiC|4zW˞'o(%A 'o4jB!Rd*˻2KN/;Ք(oR;Id47\dՏv{_ Sji.H\}ySrۗȐAvv_t؉ bŘ,vPn`hbn0a4z2SʁU T;%MRLJp +Q?2S"\$ۄz. 
QK !5H[ꂣ$iQ;ã|o"9z Xuw^dIsi;XRZM  v5K9conUy5 8bh>tBpv2"qj!߾䇢Yڲf#W[lR5s &WG F=nM qu7Li1G?rHØq9c(Um~תO>'0ON}+/L_gO|Հp\8-\0OtZm``N:^A?֢塝zUBw}ѯv?S }tO'c_67zSo!7" j{>"INH^v4V/::HVN ËNF0P]>R-vv>B.aTS- $(h}4?>,8F1WV pL_ B}jT?#׊^ l0z7X7Yഗj%ZjxYaJ-4^uyYK%hSnp-`ի2;WS *q^aڠY򃾚>O ; iv$4IY +-d^PO!Q2\)jprۙ,PTtw&ӟ5;ddawZz_E9&Ч&s]ksNB},`FҾ[]8cV_*7¥ B5TǏ?5qߺ>g߿;{g?y&bO>Σ0ͻxL,c%ɯ1@%@BE8ㄏ< 9Dor-bӰ?*@e#'LInԊh# e8#KVX͘^ȹml]k.,"sDbS⢦8*DŢpVDB33%y1 &RDFE[!e#vz>% FBAЅp݀ P9, e )Jb, %8 :pRI!2R%~P2ri%m 7 _) "ܦJCޣ!֟F˵5ZROqtC0><9v~ .aQ"~yP|z><qfd1n߿g=t6ķ0ϗgaEgx9=p+$ K!ogӻ\/_?sO1> 9!Gsڠrls_>dc͠ <"y]1|#| yIsn=ز%8X)# ; Fk }@CcSƈfpWe+8t:./y|* CddR ƨQ!ynZc/ &TK zzKhj#Q(RQ?B"\2[cXrGYʱ6jU(\Y˓n'K;߿:{<@HL``cNJJ;~ޑKla+,%Q`,uy u'FrEBlU~].(`[ [ P∐[<@3/ Z a91P^cԥ&E f1 &*KOZY҃ıٚ`ia^>> Շ&vt%r%Baf&^6 fx<DZfPBrYN3E3Xt1[ ~ҿwxY0ۀBΉ14L^mK+1R@'`KO%Lz#W :lʿ!s % jVlW3@ x(CeJ_/Px|) ~zq֢ LU3F bFTj}-l偗h00b9P$(H,;"Sv \6e6J [,LZ^N(%'thx D-.LwOOy@Q3Y\}gT0i.,'M;kJݞ]Jd䲸yMkU/8iŏnWCKE0csIgXztȀZC0UAh o22qB"s(%kcPB N8:0 ԏrWA|T,jik8#1.E)ѕ(EnU@|hZg;-FC[ o8gX[i7T#:`Υ>ۅ4D#ZMg WDRbt,<Ғj1[V# &D`j0vq8 %5t)S$+@b+JI Q345԰N3Q@1BjXvik`M4s·px),;H5`L2`R(, *1Zd R,z%Z#gQrfP1g=˙>G7ӽެFΈ(v9㌉C̤ML ݳH(@MJ5GxF+r2P9AUBDJ.arXY"\iMFL@uFH[EP/1^' Fb.jo*`֌)q^|1̀MƐx:ވ"t{(3*a TiMTqHYgx$cy ܺA DKFBv4H!yɥl4!󬹖Y9m@ N(cP^zcQy:&mvHWd_5R3201UTNpK @-Kw?_"Lm,>K"rˊjpSڲt~E[~c RYN]*_JbI5?iAL V_W?i؄fCo+C2Vu}yT7Ӻ Rݹt3="=%쇈r!W_l9rf- |sw7+[[=togf7if=xur<њWي=51iA}!ތG?3] ]zaŸ׳_[B.{?=֓IPT9Tʾ9.X˚Ggd 8G[R [3{`Csˀ1> 7p&C齢 xkQ0qVc5@3O~=A@op:? -GVG%Qu'hgٚ:|bM?뛯i %W2V Zf/#7U ?kAٟZT)gkvWNz 4۱ǵކBd8Fݫ&mRݥ0R)nd[ݩȀ (./}瀔)ɀџItst۫`oG7/G|xpEb2xq櫝,ҹϒykTI2f\|?fγ1ϒs陋RRhsY9oӑrm#Sȃ_Xn+A[[ rDMVtW foeZ.$䕋h+GTK '?jTv j-N>:9ﱦiD~ N ftkxZwxÁr}ȗKB @ /9/RT.ht{^3XmE,_*~q$쐕`ǣڰ҃$"޹HմH.Yv#T,lҰ wg O?V:e{ty8pNFWu2򜈤?=uՋjEe@ȯ2m}uuk=g# e_g|Atuzko9;]@DP7t|蔊=˪ot)PZ2չ-s"9Z&3ՔUr?p40BKwk8wR1[B.L {Y r|NU0D#Hm0yKC0{ڛAv@ŞLvLuݩM9&NiEQ#T@B$4=Pz=U,p_J@ʴOU>TXU9H6[wڝjC_ܔOL9P )[A)mQ[_MNtqS $;6VY"+g`IGױJ\90r۷9] گkCvjB~6:^6Uw t})Yߊ`T/ †ICzkHUrM3FWׁZ\VPtS. 
bZS6\|R-,""zϋ=zn c*$0N wgj2=SӡkC<Lj`QB@ ROeAHɪ0#gZftS'e-=YodeqeeJ>d՞9#ޟRb~op3k8"fɺKec3ZJig{޿4[b|b1jjדOY[ɁyxE/[$؍XNa#CHە!|&huh,L!偂*.Hy8]o^쒓1x+_2b̻Is PZ x39$Hxq>vR.ٟ0Dt$&u=!Eɡ!fZ[94n4@DBλ Z2JbxӂXi%z[-'wn|-p֢HnN/)>wtOŸ;F$#jpn8&ڗB mÒ!4wX=84Z S ȭ `"+¥Ȗ8U*% PhEhZ\PE#bNAUR ~sFK̝HYJÂRJ+̃cGs(xX,q`1ECbwkeVSt/sttQ ܢ[?}Q> W_=tC*BXt|ws_}ɽnX Dz?#ǿbEo?wfd6Yܥ燧 6BQ,cJ7 +hofKnņ6 KE}*?,E% "*wSobA)]suأY\^6^s#)SKU׃O Kzq -t0x&8WG- OH&pΒ7ku9vs4fҬȢ3DBBh +b%EzY@'oC J;!w}qԚXH"Da֕ T"-g{G*\;n$ب%)%VRbvGՊLE⍫,OQ礲? [0t3i_$}  ,+YcA,4=5%֊:e gb֦$`iX,dhd` -\ Q¤LmXT\˵Q<\}5NF1AZ9?|߿;`ɯ,[dI08UCF>CH; 0EJ -%-o1Yk~rnk|x #s6$Hoy(5@*dVX!aM k@ J)W*b5˦HƓ>m c `,=@qWv z-}[6h<*'9B[iAYH֝lOHOdIO69Y-7eڃM[HUzTwɒ,{# `aqDfVX#xK{03L Q @%o%(;Ι#PwqòWHOfO69gLN'Va`z)l}PIaЂ ;~Ի5aqX,2r#SVGiy >_^||`ctEF\-h,{x2x nF}a~3e ~S:#zp;ju" ~3S&[;j>/XלV.`\~`)/?M dL9JMUAdw֧F.GMsn k(Aո_$, ~[vX90făc[X7#IcsZrl\mO຾OéVdtU-Tu,-kCUYQ9{lp%ЩdSD#c:%džgֺs0 96&5AG9k"kYY[΢-BBq"~v/O><:>’fkfY[֩< p*+k%.3t8ISoeR nt4y?#WLF%Q5^#V[,E-F&Hge\Nf{,eP-oDZq3B CX hC@Nx,+1/FQ# Bs-fhU]:~ Ab 6F KM2H=y\2X:PI)vޕS,qx`|^| %n 'tdMہ*[26j{ѬH+JTҘAVbNjBBB) Cݍ! 'l78lO]aaZ}EZz"2XIs9] EQBru/1"dk4`pN Z ĸwA4JR,הd [4z h CPFp$Z 0-%moC 4)gHjT6?ҹ1RB1RZɥ$@VJ2CbĢTÚtհ~dr12w !o) *Z! 
/DŽ<t0WȔ{/GxvIiO;kU=ۤ,OB1e%T'yp49fLku$|؉b{<`OΣ߯8I'~{@_$n!0gǽi&idtن,8<9W[i_qg[=38=8cX[ L;x ;ahx~MvF1bxWLS~нfPڑM߻P1p +V0I]K0ʒ[%4q Uq鰶N)Ɛݿ~ijH|s8>1+Rs\Q]LfYl039f.gpdoN" w?kJj 9-.ARiW &"CYT# >r-[qD ʒ9kڹf7 |r Ei4SW?S{5S.Oq% SZ _rҊگ&$b4RF4ީs0U`-uӀ-bx+9[^z7lĤ9*YD&8hRBJDY˅AjʈKxw'iTKiTx^'wyE?S̀~Ģ(qU(7s~#RD!0Ish/K cS6ҝhٽ26%z` K%Ive`q];+Dۅc6~8DF@ 5z 2 '3&l]ЎKi-Aq۶.@DjA:.YRai5xI7-U[Gw \䯐0@RM\ʐ#/DqvhaZ]lo +\DK|fTsjYAPqgtДt,h^՟ex";[$]~a.,%ZHFN0W6{CP1C@ Ab}x4bLQ ,*/)0>g%eT Ii]Is:XKڟ{i\~)`|"EgޔjiVyؓ͞)oְM{<[5` g5t$1f5׃Ang6Mo%D/%Q9o 8c`_S9>l|k7$_Q"3wOEcMl|#dQi-hnu>LT.ɸbv zwJzV4nʕ{GhLk7vc[.)GvF8ai|oDօ|"%Sy+.$cct&qM9yG[76i;<^Kn?V?VȔf/S4dLzQv1Bm0ϛjv OHR1.KCMg&~rrJ?[lŇ?+:Dfӣ_C+#?Lxm!ȫ@@+[ &55̭(HZ =xنjNcڿvn;dsSANq IGCZG$[ 4'f׎>j[2"[.S0a )C=8H{S:f{S.)7O}qo(1~}_f4r E4KVvAb":}nP#{ڭ E4K4~vx7$y e5}8J"lslۙց|"%S\@hra$rg*|g k{A/}z0$!x߾#ɢoTӧ{33Bm ͪ 40';j._a%2w~i4̖9xSMk1nO2&?~Ui>vz40fuŎkL z%^+ cxr-"Cݍ✆5XL2GzSBv΍J2~!kgΎ,1 Й9GUuEAV!ޭ/.P\ ijKdg|r[_]W?Xl {L̃QQ ɻIN'V#͎{'~B Jj>wJfEe{ƮȢ04R3)LʛR3iim=+mߕ.;`> 9 q$*]SA&juLJ4ÒX,t\,b:d6._&=Z,bEFW"pvnoR DzN ǵ<2P-޳ʩ"} #P,TulZ 5{ !Y&ӿg2AF%9l;ހ\0i4u2?{ȍ`lŀf``&9fxl$L&?Ŗldf/j /-vbUK^OA̰ޠ+Ls%CNJ; CosH(>4.5CfH$گ?Y9,3Z;e- pF ؽtUʦm_TWT6[ED=W=Z n83z[y(;U LG }}k!ث߮.U%Jޱ'ٱ3˾jC^?Q/V0cQK`|Te^; ,׫-p6PB4^0 pՀVg h;(pMZk(H)?_PS%vq3EKVoMHs-o]d->\\y /.(@nڂ^H)FfCX*J0_nׇ$r@U?C{wgAvpQU7K#Md[;۶4"t%{"lA k4 CqKMIrR/A>ρ%;W/n+Gõ2ĨP7-@+ɡiLr2ڇs%̑i,]Njfڋ07*:=Nt?=۴ppZTܷl{vJ eN,Va2ɍpN3MŠHhћ4u m{`(M>I.kh30k_xr7ۏcKJj8emXV?_?5~$~ 4tV?Xp$߅glz+|cIgx4\ꇰnĭ-eޚvdYN_$З = 7֙L;vl>iK$w=ŏrp =_'"')&sA%吼.G: %v 'd;BxZG>Y GZ=h'8%y=AٿW(̖-:66?s}d.( z'W#~~29t_EwN|'?G7C1T)dhMr9zIά29rFZϥKfca> O_p.Ȳ~ƒ/:փwd,E6;A 7L/}_F(%ׂW2D Qׄƥ-ݯpt.BE9਼8ט:ҌD y*u*MX@BQ[/IŚTj"VEZݗo/hE%3&Z(m׾,PI.feU6Xg? 
E}Rf?lfw׍o{7{zޞ&Vhޞ&n|32QɔyGC}[ꎑmBΥbVdZÍ*pK 9~Z:+A`66MD" 75LtHかd'O"o|we}P.hǘu2FsZ OIR`Dd|r5JO> {N3y fbsNRms&2cFhb򆗊uOxK'Wˡ1fkB?62h%+g&{d\nX)IxY rgs)/%<195.XݞԐ7J%hƓwwWWgGW;-/*LgS~fYz pmqӃOg 2 ъ5Oу>fP4LYϊ1yXgo<޵  ߟO$M3&pj%fnI*]sۻҽѺ _xL&ןHq.42HUӂM{8-U!j1䉍]r6*jdLҦϹ2 yfP;ld=fji'@~Dn ҕk̠aBj׿kņ  ܇f+p3˥MK?Q^A٬A_۾tBo~x4b^/V} aAhtP+ ڥ[7dC2zOU"r)<SϜ.a3I'4A>uRf.u%O$p%_­gݼxQdREmSZ\W OD)e6dud T:t:k%S$JDhi$ǒL4Uig\P$o~m?9C3t=[~kZ躝erN ;-V?}'r ghm{676Q+ L*EB(d>`RYaxh5nvIfXf NQ~U$Q; sSe*58$8QTFIZh tce%[%I>f9sƨNx"sR0MLPʸG|"t3Z m-D?YB}(7LBtDŽj;(̗(qEۃHLVR #2%_x*x&6hm[=^%ĕf@(xG( n2/vM a_yȱTJH d׸դvvO(2kJș.-ݞPxYL5t߄'&D)՛jX% &FD։IST()BQy 76ByNsJzRxeKث4SPLjq3!5O1Dl2!@3l=1 %Vh44c 'OiMvSXYβD̡\M 32RF- VI#R0.OR1P6sa`[d^LUo.U#VI< :,4ۑз׎Ǵc$B(΄ZǭsB)d΅+P İw!TzX31ˣ_Qrah~*2Hk{TP)5V쮨 yY=Uz8ɌFT(e VĨTuVĨ-l"FJep.Bl[DA;tRnJQ%s:r\"& NOI˒L[ :sRsc 9@꒬TuV,mVjlmU5~5Z9t"H>גguq)y恥yE¥&j%:q@MfQ#Rj{z+߬!ZPvwjl[-VP:9_޵ЉW7 S?cFa~?o]f.a|6ߛEYOnYvaYl/#b?o8/~Q:aO_QL%ic,: XsJhxC֩0p-Lzf4G?-rvfݐr^$mؾY6I&VU*UWm WKrߪ%ٮ2W,qջ#I_6vg`l΄g0`%bMF,w!D*ˬ̬<"CIRek5HW,Ѡ i6H%`_Ƹc_eϮ YרbSǒ^?k+neHaZ |ٞ9rA,ҥ̲?aI-ZK\3ñJ(Ȏ#e<·+Djɲ$c$Wv~\ClVs`sxE i #BLܛ۪|x1,PG[TeTaTBڛo4E_QF(ɋT8B) b-ռ<ǫ$B *n2R"5ftom6T)$X=-Fx,bf\-$ ^xNaΉ9 2sk+ [+)o)\C`$Ýq152+RigDr, !;;X7ub*ѵ "ALKZ>Zk {Zk΃O*5eDRNYnKq1(XrCT%]P,BVˊէPjt@C;Td*XJRU+V%(Xa{ hh&LXD3d.Aګ.x1)+\0eIH߃R $#44pٞ9^,lDSڹmGx1*R?쎱f~hXiסW usgG諧?6̒ϟ5B56Y3&Xt0L a>3߆- $!$ᢥb)eeTòWls\#WՐc7 }i*&6.qP_Vк N\|+)Z <;] MRWG7:t}b| ; )FidƆ!)";5 a&I58c@Fy4\K!N5j5B9OTPsw9ٱڔAtsn{3 TDX!ʜG5nȣװL#_#A1E͕\kY .^"{QpY >0h׸դ C9y`k0ڬ^8w{sCr6B ¬{ĤQ K= I5f[YV__{̘y.߃l "K>d'{PePZ;`\ =A\+0Q9aA@!W5FOaٶn.sKmaiA`pn<`@1hGuC\!|+Vf#MpGΥ`nbcG‰no^|s.kFJ!1Nu2՚\K-""ג:Bۿ~obc*)%Ś&bB ^[3'Hp n؀wqܚRNQT:'0g}Η07kSjPdd. 
eUQj^0ךY?30B7c*M7i_hr/wfd>ͣFK6t8f-_*VhqY}w?Ch2~=2O@6+xa'y*Va''aCywz*yQ'`NeILQ eOéF_H+/jP;L.<rCVb6c\/ MxBӼE z^Pu—RlN;32(煇-Nw d?$i|f0~z ]I\vڔL"4T^@Jk_NQ[y=+d޲?2Z|oMioB-UKPy-v}kkH/.ffhO KR[MfL"FB/σ{-dfz< bx&~3$Q}9/:X!~aQ}M/u ƼoPv4 D܌rP@[ E|R bǞ `hû_=a|?;z~p؇|>d:~6 /:`5 qAlc{x3p8+8{Ye>x,[M&dwɮ'C9Ǯ Ҏɢ>&YB8*?V2 *jywEvL[4鵛WxCEUEEuU]TQwEЧ %5!yZ 3"0đ/I 64T7%χַiVޯ;ì n0ߦ9|מ|C>xk)^}D":Fp\ 4H8o fNia:J)J-Vݪ;@d@Kp)pietתMKP-9Ui4kB%)k?Gwjg젉rnY2e4+,ـ0L'v!u4, lSSO焷77_#߂쪛9va0uσ'fKF]/\]2ؐ=>, W{ԾPCb`1i)Sg9iB V^Αm5y(v6;mU  Ձb3|^.9&HK.$^`3e{sǐg{'QN$uJjoDRN*VIerZOYỏWSY8J{|}:!$I8$B~HM:o*5 =sbnt &Iχ1ɘꑱ,+.%=|=Ƥ6SkR3F[r]_Wz2ew7MuX(QFތ .t3{Ou] + E")JB:8,%`XQ0qRfԥ_P\]Ӯb=JM0VXDViD` ϹN)b9{M DtTBmj˧PX/D9Mtb*Nw+jw)\ {/-ZXIw6x2$H^탒]8QI}g|`K&Pa(;b5Il9t|{DS"kv\^1O_}Q| )|"UZ;̌(6 q&&넕5YkhIL 0@VbOEVS9}jαv?/O\/?"H=yW楣糏bADddŐ? g𷟿x2]pǗ |ng`&?-j$~iO/< Ep@l?`5 #pռɃ.qT @YBw!=LSzAEsM]49wESA:Dr6RcR(VDJgZ.`C$AD:hԝv2&; ~I'J}iM W :7[I/X`d^EY Ha@A`ٶY4sY~șsvǞt*Ⱥ(*ʻgD*g{%?x=S{^EtZ| W+5+u܅ՁMHm ^,8P4 G8 j=y'BoD4J ;5;6aýSFkW@a ™6 szzj?f -r#`U41OOa mreE!v*<% (N`To %W0ΛLNBΈ-4:BWɒyd)VvR䌰Qy{хxsSY7uRAj^iD*6J F({m;h8gw9Zz8ZjƎQEz'rU+HdRst *B{oE<1큃W Q:uٞ:\_q|2wocSN9c~:_ꊹa ~qSeFwpK>=tΠ**C#E(KJ(7nEGQ˒b9sEa?x|!J d8ѭ@֣7Uf6UǶYH]3N}FꅳVOWf;^Fl⌜M\[ஸW m{qDh[narԮo?6[ϝBvҩRgD9CD\ᖥڷj?9}2-Z8Fmy}qu~6L]Zmn0/iNe0[ Z+1jp`Z\ Lu͹݌yoM?ҤI=o5<sx}d4ZJg k3ڽ>+>~'ryӪ,&.z&aLY h'; U3hF$2/+zYee_ يƊgj J1 馰Bi6rF>+4sҷkĦE#L.jJaSltK<ŷRLgjjM3tHUC4x7#s&!Ayv{ ت\< MH>q>IAWr͵Hwe_~/\3iut~ ⅺ7o߄+Ս/м0^꾑% N"^ Su@%Qa(vNvܙ\yt0Ho Z EOݽ6?wt{Z6_JR,gBښ/m#pG3XMN7t!J^pA4 5u?=XAZGCUQ]{ \Kj!-A6 X4U _:pTi'kV)-GUK缪.HkXԛHYY]7gH%A0>TPTʆR!d|Pk2Wk<<؏-C_(\mxi ry}e?[>sD>H Qy_^jaiA[[͐P9B H ]5$h{ w_.~]5`] ~n6/{XUWu]l#תi%}v&am^y=o޼l^_Uo7~d&<ͻ$Lm,H-i}R(UPJ@KzE6+p|A`<dWzkDAtj$M҂d ٍq`rVךo>x2Fxr cr,f"`1+yٳFY"8zZT.~ml_nlo?,qp]?rƀjYg՗F)q.ͦQJF)<ň&)$QunEw[աG%b&E}5&E.{Rp6/ne 6&tg!%3;Ȩ>r N7[4bEAKFAsRo?0tt$JϊGE 8:(ypQCoHSij Zc!e UQ&Цb0-m4! 
AYܻgs/yy]JۈrT zg쑞p/vcA>Ƥ\0#] 3I&jwx*K\㙀m[#h;-՘0-TLN;6{3^NJc;zQ D#};zщ"9{J“|fY@?t~!1jheZQ?q~z!O7Jxǜ!ק)gf'oEK^BbBZp b+UWUs+_n_NcKr( Am-5ezu:W|mS([ r;T/Wبw"=ðRCn/tYt 1p:sOez.&HW ,wW'00Ep_(RUF?7mm7PRq2>V: ]e'(HW\stM9kEBѠ(i=/DSB@BREbq`Ur^XՊvVIUFmFqBSQ=^NI"9iVTD$T9-}#5IiHI35m ]?ѥ̙5 cb~lBi>eƠ@,eBc\ E#88${Ώ^ч(T<({ZX[{zQU5.;j>djq<^e9GLG2˛3\"xwN:q U}Ot[m?_4cooؘ/r>+*-yH{}|{o|/CArɨ$!*%ͻx`.^~'Q2ϽGJx?XT^ܹ=ӤZ /⯤rnj_AWP^HU.eOu% 72gs!+lFxAvQ[ W5b훖;f-a[>y@K4 [-ez7 'ΩڤD 9 ZP3㶾%z_uyz; bz&}עގ%zNM#wՄE O/zF{h%}0|=:CrBUltJTu%pB174hchjlMo=ش>h -"%@ۦJvh%"ZRo#I_vc@u>Do8vv1"kP0h/{ɳg3o_qKlʂT' =y&V ف[:t-4E?1s((F~nZ;m(zƖenިR~xL*lkFKa>m0IrTdN-C G8"YP&:+-P-bmi6`La!\;2˙3=̬@20 `G~W(BSYʊc׶I(t_): |$L7?4 %'WYI9Aw{ -+".zS9yZdz홙,]!b?>>Գgn"O.PY;3&L6\t\S|^\bMJ1P=_D[nw֒vG)ohwToɧ]r=67TM$NɓKZd0$eJژ ::LfPȨIAC|rA Ĵb3]%ca+%0 N*=P#%K8%+fH1x+{浞u<2EJ8xa?k, S .1wh?>WXh  ]5WQEf7c?32 *WqwUyw\NǓ!,Pə#ڂ2WT՘$P&hJcmA {5gϏ9|pµAӁ]zfam8cx ݧ6\>D@HJ۬Pr66Xӈ{$R$ ̱81h0WRX e9BIfkc#dw%ZyOo .Φ EOگb0ԳPai)3ϴ3XeG弆q`P&T&LUzj!vQ+q+A"23k`YKFթT7 c$Ҟ\W\*!mI 됲xҖ9)ڲIɐ ;sۼ#0ʸ^cNmhUk&*. FX9fYwKyز>HΝf@;,BUU$O2{D䑈VXNQ[ZPN![FO>" 2dj MR)ՇkT4f'UE sVD^{s3ޛ\c0Y z9 BּCb$unv?>x 72D}X޺_cos y_^ߌ]^8/i>_ì織 8Og4Z>c]GàE^0"L'VGGeTnp5& 22bl׹ *,H4t4Cjoͅ f{xm9W, J9.tnIͮ֩It#ڡ2< 5=N #B%.Ejŀx/ yq@GcU(8ϷzX*8̇Oa+\Ɵ|pJy#$*4V vcsS^{bxpEY7$ rdh@Y1֬ !8 P1($xEkul"ѫdѫ%QgOYL%0)Fj )Fvoh"'^I_[|A&O$ ~H@8גHl~i vH`0`ub\MVc!#ţT SWPa>�XrlMd*ьdg$&1n $>s mpD1U-dT+3`NEHdfJDg#a~%0!%Rr\j7S8?VqiIR >RE%X;'1'TZ'IA8 xwTK8p#"Kj?u--FDRP+qu{E/\tOG:0E@`:lOwv8xFj ϊD^1ܻ%4-&Ȋ,HHZ]oHQ`JZEr prU|۩fPi\ 7.bC18pFI#`v:˕A;ʬ>ZC8lE.~BM߬)oA5ģШ + ϔ;y$h3g[{e~Ìw=kq.! 
G1w/B鼸\Ety^sS>O)arEQ(>`~:l 0eR,_nwfv5K6xگfrZ--Q"aՓ9o'8rǚ J=oIa# *Q; k4ـ<ƒ %% 'a'XA%5֕NvFIbwFI|qiH㕚 Rg0 (K:я=V)d~!9ɛ-Hi1wYll6U7c WA{x8btͭ6I%v?}p.&1.QiB9_$_h׈bS BCp@!3 aU7 8E|5kkBV@U%NC'FvxM5ﴻM"HVeO:ւqTS/xHGͺHuf)͑qRTkqشrjYi+Xx4}_Ml+m O dvfg;q 6`+}c bƹoc ǗQ8fr׉[ݕ\ ')1ğWm}Nš׷⦬% ELunʻQ5 EtQG϶r&ݚ%j6$+F2P--Fmďݚb":MǨݎEPs8ּBvkCBrݐ)(Sr$cE3\ω(Nos7LǓŃxP3̯/^}2W7bW%bk<""1/7&{,1>^L )}MQ/6-<1r4~*>F&Ռ1"-90\*K*Kf?<1 V w_ra}T|-WQWUv|ᴸw?n?Q3G3{3/E6bJvd[la-T 9*Sژ)'/ePvpeŔ|Q]k|2sPJlr™D'еD9B|YbQ,$V꬝JJ )#&_)},(.}5R=KWX?p;qT' aDx.c_\鱝tHa!q)I^Xm\XBH0cEj.Q B#C rW6 PLu^=ݤjUl=[+ g %'V~j<.CYl|p /^ ;˴H,TFyt ĀGXL[#Hp)-h&9Py"T?BK'ҨQRC܈J`OBՅ@wSJu1"T LVc 2F\夗aQP˂޵#beh,4}"5'qlgn߷Zr2ld) p *#r!+Ё 5)Ydb\$tF^.B߲nKY,%0i1f hGp.ILvҎVrWZg Jt @8f;Sc/bizSz ч)!xFV{$W*HìT-gi%"jL%&X)%#b2z@ƃl5h8sؾ(WvN:?ɲۖ;զD=Z,sWP Ssܝ3_×Oݕ1 c0zgVc?+z/Ǯ;1 pv xd=P3kPyeC$#b'oGڹ'w+FYgy'M@-C^zZv@Rm6j73iRlvb^fld CO3L1kЙ6zn8Fs +_{ƒЍBsbfajʵWh ,P : }K9LMpE8 H=TCdgbGcl1P|.6E̡j赎9rȢq(\4*4֜km47W wV5|?EvGuKdvGUln8(o3ѽ";76/oe;l9u?7-޻%1lE9b?w6-fjC\[{)(/ڡ&$qөL=!W9kyW{_^^ɟWȞ&ˮK9{;j7kk[5 Ñ=\J tEf,ȔUZWbIB 7c#\eEN&_v "5@ZhSsA(>Gm"kUsE\t!XTߐ.QPgBȷ-yb(,l K?j͞jYG:LFkMY3yÐ7$F =}n7>sov˧֋2wOI}v'2;a(pDs5y?ݣ9܆y ڣ=օʻ^9-rm}]nObc3Ҟ/OqUUٜt [N3SjG%;8Wg3%tY_gߗ.ƨ=}ڍE-m`2.ʰ$*j:#!1+J}u~@_&7h 싹}u|-sygrOyxof/?}n,~wW߼)SЭ.T? on ./eVw?g ~snn|ײo>o޿{a jqwQ/O0]K N>OXGʎ#찍^j<MZUԕq?@8zk& jTvޜ48I W<;j˞S[<<ў[C>!j3**{{jܵEDwmMoj_֒G1~:iX>w#>cj ;w+kx ̓GnG02(*f/1S$Egh|/bZ0`gqvƜF;UX{gZ@D#oBr$ 5CM}Ά W!%NL BmJYe6vkȨjȑzp߸`uK-ݧv} UxC0ksO?LȤ&8v ԪC;ZW~cΕqfn.C*= N.0"84+gf^@kI<|@_/J}.e }맫W/w+\DK^UCSE;(F2ݘd/OK͋?뗗o˻O~w~1Ax3Љ9 ʮ l?#.|f۹JrCI;PҰMbk-4ۅoy;jn!8/ `蘻P}fNX FQV-rݯ7/sW~p}[~kÔjN[7NXg,hywZ˫_`Fc6 ̡]`vC;cu)N1ˬ u8 MU(oMO 7lCg0(|Mbԥd1:o4Nv-o$<7 2{o jzo~iä/o'M{@ |N8 :jݎ{mws/$ڰvW@,] E@ASOF oHRL‡=SrNA)-v[ްAxl<@/1ũyaOGzC8:Wv!v*zeQ _ wwhR.SJosA3s U9C$(aM|iRQ5}$)AǶB6Ysۅoym#/ِd\D fIS;1`x,o+LJ:jcO±{mR&+q7Ç Ǐ? 
<_Cřg$ TMsZFɚ}ynyã34V¢ꑷ@42~C5Lq!e-Sd w\rm*Ϝ$sijOY r0 q%q04/r#;H֌,.Tv`aq{ ܧ] VƖIN2 ?Vc˲j[L,Tf,!d|S"!cJJ\Pl0j2u,FCٙg_EB o79cO/kRBDv֪=BzO5Y?Pc=DQ+ɪOfgf5ָéfS*Zu0P "iHmGdžݹbgxʸ6BZFlG=c\u>ۼ G&(<VZTdl&#evG@:=7/Ina|c~~gl8ج<4$a̗u5O|^w^] 9S o7w[3f#ղAyLRcR|mWo!$;ErZ񕷾?iB8〚Bteχ+Qh329"nރ9A G0A`k7Ӱ ]gBw67PfCn9hYEpc16o$7_gX ]dY6oNbG)|FټcK'T Xj^j ҧ cl˜JD4`V0 "s0^;܎>+yv\uZ-6&(YS;zеyy7Y~כfYL S~ .\EA1ji3f{f 'Lc% M_@ ZC-O F`{Y nBE O;>+tOϳ#& 1c0V0%WodGv87ȭ.,R QgU=sU|} gB:UDa"3,7BS?jMniJ+>gӐ'Ѧ nrrL'6hODrIڐS ճ^DS;YE,7b響~( pr\nuCH9 <DőFɌ"g#١L#b0ٞLPmcRZW3aG`v?~1:8V4NAIذ&YaddkdR6o{nQ9h@:Hsq՛n1~(!p$Eg*$U H ] K/C4Pd@F'Pd#doG&xQ92 !תb Kq (+T3@4@^Jd+"""; R'::'7y뜩22(X)DR*j1kp`Tu(Eq\fJ1'MNerh: ,2ې(E(х{RB%Gވ>G> 9PͼRͺ, _VײP {8%S!2y6M6qXxh=( (ƓsҖTԉI 00ʨlHEr%!NևYؿDv4`r R)1`6I6m2BJHad宆rvlً U2UmPx̨بem"ֹ$>Uڐr$[{y]lFTL AU!QS-G>}*FUxzR]';0Zd:P!Ƙ+OfgEʖ'Yb~T#YE)VmYYn ;R%(2nwjMj&6"65l!bSBEmE?=V [ټsn/#a|M}!>5E#ҎQ:6oj(2Pp*u2\l+>i G,P~<ݓ<=5Jy;aU\\Xhb#C"pٴuXQ~n$7X瘉dh9b̡Xiǐ'=3tbx85,>Z tQVm VIݹBGdg,:0ጒ ּèf.E0 \uu:xeAApEב'7(,痿1,F̩b+ZZ3-sl8GF94DkŘs{zAۣ~w!D* @$lIn%C dpq㨟;7=kǔ9QD !t"1j)(6Ҁ7EEƹ 2iȹaCʑ숹ڼU7Z( FܖM Lfuq,޽g'FgTt ZKG5h #lE5vsirxqVZJ"4(yix cl^{Ne\CYn,[VZiTM#c16F<k5 l8rv{X;|:6oSEvvyG}Vp.ںo9{?jNm{&5[Va]~ ^]swTQmʟYEImwA ^~ރC8/_Vj^V᠂WUd߿~s^Mr9;ׂOe6Ə3闋Qȝ@-*{a޳U~vK[05u`di7Oׯ{#0of˳-sd]r)JS*b\ˆV]Z\#O@gVl/OkꟿS#| jd՞ nW_0q%R/ Q}y6պ)%o&pZ8Ob{} ;7F:1c⻞}|rB8܃}tOUTW|ѨlAEJgP=EStL9;;=)PNVuU͆[5#~>j=ǒ\XM(7u{ HIbPK ,?g/S~ XHfZ4CH~uRU\S5n_S/&O̎?j_^]wy}۴w[d9A5]e mq;djcqϣWڭ[kPfpsQk pb5+ӷSZ>xDy&I)? kog˺ F$-/I7+ÇNma'Z=J9w{}7qY_>#Y,d%VA)}Մ+/ Ƅ )G?_  Cȩa/Xv# s8U`6'9:8ԝH݇H=֟(rϗ=S%qs aꈲXvz@*zQ+u"0R%DqOh^,HInm}R!<$-?K>̇I"pev)0ϼ9mFC' 0b#/~=Ikhy _K-[~ǏVz)Nr?56&,%͖_f+6 l9 ɛ_?a˼Xn|6?};Vɻ/O.^M8R!+=Lr7pj$:#{SΌfpy꟮s^Fjz4ER d_ų{/-j !6GS(l@!B1hl~-d΁/&Acv\/_hyry^dHlzB)5A竳?'Z0N\/+RhVsYp޿d 6oÇ\֏윬[˓ l= ?j&K9D% Sf4"/[lo bCb]q"IЛT"2.a/ kP|gٞlۘD8|ƻq~,_]zΆB7U |2N{ΰaPJ}HK? 
*zPвAF7A>!$e~QI-(Ad X0chJԞqP7t[ Q 1d`*zCHצc4'tw}tQm= u[Lg4{GE ܠ',4[lw2F~ێgav~s`(OkGM(YVGF:_Ƒ@ZT-+^:]jw,+6oo1V%BM~}>ONٟ y7B~X$|:[ >T5t Ky0-[%Hiڏx18V{p{63{=oj4N$β|gbWYu25uYtWr0w /~=#B#,<ґ#Jcz\hh؛'yݯq!lDGc'Z)l?ւD˂'dxu:ǜzUk܉솳=>twe=rHWgV x-y``g,-? 0eo %w8fWfWt]/ې}3F(*I;ag'I:xX6q3h-.'01:pٝ]{ /S [qj R6Ao9.}Rԡשq._a񱚞;tN@3ԲFO}è4=pyaN :Yx*aƅ9Q ĈW2}PX^]0vorzfIT3Bi>Y@iAO@`?*<rҭ %y= 䠕1yV 7fK,t FLlɛ˧Gv1J(18W9J,-1*L+0l0hP8=ysԀe0pbYܙY &{Nߴ/0² CJSH=iM;E'xFYj&u5qUw{c@^`a8WSіi;K 0I~ې]F3Z r(kQr?M >,䙛hgRL޳Mhak|Mرs3:vxTD4Ud*p@DC} K$j\cVNMDRG\P")qTmFX ND 1?":XO3]cx *B 3R;) ;B =#h4Zx0-BHu$YP!4k4!]jØqAf +C͆Ax 6 J.8ry-F0aQWHejBAR>D4(qEPaBSMɉg= -Y<)tPGqF0 E=38%Pch1qeH8VH ˎۨ8[&Qآ5JkQ'w fTE+J^K RChp식p|Lr@% lf_Ww˄Uk2f/=T rA嶧~o{BU.R R^8)F(jX.J9-{EKDv?]*&?]J%rjqI3i7/|-"Tɠ9ة)=X@YF evab2jCSJq/,'mV kp=gԣQSA1< i0vUdRCDRnaD$}G$PЅV9+KiOsk=nl%cZm[eBbέmpCQ! kk%Bq\j"2 ZZGunIi }}sw63ҏW7G)0y4U:ӃWZ}[m+շ@Hg]%+8l/Pg&s[kBzT22x-cF2hX-"߱OE\su;S ^"DkiV^{B4FO\@8mhV u s8=7×pɺR;yQp!@CY5&6JDhYH`\R:) tRaDhHςi O HU!r wZ0Yc4 "< aVW4ĄӀku^eIĀ$@hPkR3ȓMfqށ}_Verʕ^˩2\l;Rq;k8LAH5ދ@* `K.UHҕȲk"M'FJTKJ&(mlj.S4^(!cԁF5\^VIFDFD!8˗6"t9^<Fwlŗ~mQ=埭zϟ=}.,'h֝_%ex7+֞/ ҈L ^\}:=YP [T:KޗJ¿.q~"LRJ*RӉMPr8jQJ%J)0ޫJ;R(V؝آVQ)})T)8I]-Vlg[+\-ZSʢ4FD+7{"%i%pWpbgh߹2#w0E22)Dv7"VK)4Gϴ1?;_rIy4 㩼n*&ӆI?m 4UsNԌoXP3aRe*U5_?pT(j;R㩫 Z>fP! 
BVc8&gPABxqa  !YXg0v ?MFʞ2MZPus8>h)}/it&ïaVhj oxBY4Vz/?&@140Ãԁ"ٕQY $07Nq``e-PzndMh}7؆v0Ek\C q:D>#ea&-ؠ݀ef@OF&Nip)=DPnM#n< }Ci΀Y2}40L(- 17{_| [ T=JfwӃ?K5aSUNq˩^McdVRڸ۱j}9 ?m=ޢ?#,\>Z1oW6Ͽ~=x܊JDU\lht>GpL9N# d>=gi 2jh<60z*}Q[{vMݨ֘$~V@^l>)thH XL}P y>L]W@ľ"ꑾ;/@43hECl#@ZMtd@qnJD$;$p+`]R1R mq#aBq#,O&?nCܘ*QBP\jqQ#Fnyz' gǍF 5[LYh֨Ԓ3fm ` BÛn?Z%Lx`&Bx>hn7d#@ /7UO:oet&A>7e&2>A5ZHs7>t8uNzOg rI!rXPv 7 ;VzC3W4[3W>h=q^QyR J` j8 >*\@&r{ߞ}=ݞpғ"?^]?T(F63HX&?է;zɲ_>Y-&)qwjnӣkqi~5 *KcM;uZ\`"fta6+g& {Pc _pbqJazpFD+!~ɿh O !落B iZ[eex}kV-jG[qK;\ /u]d/չRk+JqPBJP,Pu݆Ժ51Z"i+0'q~v2> c nӉ|ܥ*B7]~<u?cqee|+XAY;#XJ::5+ ..f~H:9_Q’yK\+Pc\HҧpR0' y&ZcSƈ6=zZ rL%mL*ekEC{Lևn!xZ rL%miͻ3fz>,䙛Me11Bv]O7I2?t(^ؑvos'Aד;.u^KBRs^֞,6crM{7k}sZz~3hZv ė Ď}!u_T p3}8ONxyI3%LLi_K5ֱCw0^xij ל^fnA¬*3+uBukn^4{X5.ة(ѳyKdwLxA59>0[q.lwSgt2Fgj-Iz[yc%ŻU]pP5~RPlG/'N^YM.9\σL.8:?Oփ˚[+j1>n㲸b^<[Ǐ1cgښ_QeOvW2nQNr2d_r 8D.n?)[% AQG-dCtH܏np揸^@x7/BpG_ށssҵ5GI5})TPi(7Ζ4-%NFESZPf$H4b-D`>5 ,A;>3D٘opUg}P%I=JpBAwfvHyUAғFiXe+s >'*5e(=i2R1@sRV6J!pƇ|g\jV=C=P !ymU ~FI0bQJg4J C)QRp >'*bQz(e3>g|@(e3~.5ӉXJ(%Kz\gJB8H{W&3JO|"x/PJPJs`QJIJy$×R\j*'ҠܼG)H %>JM9'Q q)[zX=QJ)CP3~2^z\*z>!=mRR̝D4Q|?mDRzҠ'%;6J!pƇbg BKM\x(W(h'ʷ'K=JM9sQiJ1sJ1 C).乢QJDJρ#^0"ߐ+JNT՘×R\j6J!0.-j3^#I_<J$DjԆ$IJ,T\ʺO wbY{~Bû_|[7kJ'( ?uOv6>E7zQ\-ܒO-_ l0WD:I"iMaW[_~K~{|3ޝV;Caf+0U-Ǒrի}%uwrK2$ȋWJcӒ߭o?s1 2yzkF@5sY{f'[zZY\80/=vz6̃61õcH)ۜC R\R %z;GpQG3TVCDZmiV5,X'r%2 S*fQjRa1b#U 1k/U$Ž#A }h7BMG[8j1W*dZ}TPd*R8c |c[[:bxѫ2YxՙrU s똰9bT܃*Sy>W $rsφ:G?Z={{AiΊΦGZi80*wwW_?.?N⯃)?ҵ#襳4cHmDZΌ{[hG?(^8{Ἳu-s~J抗d.}7[un?ge6Jhݤ|0*-jPL݃#pmr>ݤ^ɏ^)y}{X&yAן-ye)wq'|gc#.߳? oe ?5Z=jȦN?~vBI([=.˙M?鍏?1RG񖬜H_kKH"F\LcABCF]p^7}hTC0Bʓwo"NJL{Xh;w4OJ4-%=S=+Ld={pP=\wA!FM7SˇSӪ > EIja,x IVڤ~ŧr:I.Wkː'\]zC11W yys?c?B|}Ӊ3d9sfNvC#o`Bw.ؓľjՖs.3 ?-wl9S$~y!K{U QqnۅXo@-rbybrʃyGx  S) PA,K4b&8K(B NSM >39TPf>! 
mF+3ETbÝ>IW3 " ~:x/뷱 ݭZ] l:&zrxX VUbRx.Cjyf\*ˬƌSA)IhƱgzL$ՉJqdT`4BؿX.#{]F=+BOeE ^jHLˆunY]lDJ]mS, FkL"2ʼnRcD֕`*T.|!g|݉e*ՊW]w9yJOug>x&#?+}k*~z.-Q>㓮l]d D\!o"w?Nn= 탋CD唃 7nuՀ~N]\o|\FsnB@ʑ@/YM|60FbUȅ\ WaywyΛ2@rGF}cuQlB|PXkw|e u1a)- DMRX*1RcEz_S#@(mxpafsB ZG\Id[BLKM+>`;qUjX?1)6rV["彫kgHz񊋚w&{Hܸr4>!r/;xe8 k+\44_frp6%yP):6W7ic.л/1>@nqDľw;Tg- л/1%Ȱ]J CK@@Xp,Ml7œ,{ ; (6IZ]PSwaI8:I껦I2@)Q> 8}DSy9DZA\&<TʐO9je={zBG]exR5SC5E>8ǻ ;e{ {Z3ʰC@e0nRH.ۍ\-*% {~ =5g 䨾݈`EJvU=X цa/l _UBU+^:<#[AL*>ifUQPhjZ#ц5waEέ[VhYTg/6;+6mu*A!~Jj aF(ac%ȱ7VkKxiͥ\*u!rEzKU9AHq;Oo~P>;,> h!_8D03w`׉AtbQǻ/6ݱgz6C4 SR>Bn Rw}=ڪٴ\wpPCT\LGHwXvr3y~s&tSɃOS3q䫳&w ;܎N[m>9+lqj䏻;V;}{ 均[Wo'[(Xd>f r\\_͍/>-Wr+),rS=6Ӊa"5&Nʙ͖ssar~Q2^Zѻ{MwF رSeϙf Mt":~2Ia(CE$oD+atQx9g.Li h B3%-m.El3_We3 xլ4adj%_Gv1"93(?\/~X;fsw lpIJ$F* P EHHKEĸI[QHmIU߯+n7ZM7_:e:u516J5z%vpDc# =|19E5$^sꩀ $x0 ɗcPFyJZ8\)EJV@xJ5N-!֭>e CX0hE%('}+6ǒgZHr\UnFJUv'dWS$8c[.Yc) AQɣl D6h4T3H3PTTY[w.G4aRuXԁPFĀP'w׳?+wpcV31;^'V(ڬ-Z5/,6E-5핲 1d[bWP*==;JSPtcTUqv'@ J\Ewkc[WI cv%NI&B!X* "q(4qr.vU<:@I .z"HTo ԎwqqT`Z e@Ua_U?뫺AW1}()+R-戚Fر%B\f.u,՚:aVNRRTAjw2 g,q F=| ZK)fp.OIeT)KXj;A2h'm*jĎy9_iV3ajĩd;e s/@J$qCSߑ %’$uE2T).s,%IZi$QfL]u>4éZ ;l7ARAvq<5a\d UJ+# ̾z,0sjPe-F3TVf\ xX*q6 -V;k|]WzX ;Q +l=i1v9䃻–;v2Z [OоȮI ]akצrM:=?k3لɭo ,5SSk۷jwZ3nJ/F S UOoҜjEGv֡ןMc^ ЦeDc6Z'4%4% [x9.k'DYv{k@Ksmǫx`x/ T7lF%.e<:U,t#lBf5YJˍUYCjTwuoWC4Q!%~JiD=>*p6/)\heB/В0oܑ302]E>/N:Z#*ˆZJP-( V}tWsBtFy` #d5a2O 7[l;B{{O8c|~ropXeR"E?p}75^c!PVU"+FVPCV"P#h+SYFXZA>yt*0,j %TRߚm`َB! 0ݹ/ @aI=%İYT %>0,zr*7V𼓰mjYI-iiE1uLe&m/N)Gwd|SC;; z#32jO0A*5;Ljj~ ВjC}]z,z0U=V+MO&Ds? 
: LTKqݨ5UF>mb VHZW +d{6yhEμf+U4..׃߽+r)>}GIХ XGh>z{բK>Z.b3hFCi8 @x nevW)F*,Ä-!C4 S{}5P6)Q :﨣ݎgt*jp9޻EΒ Gߜ,)ETWg˯LC/22ۋ^'7Yv?Xq\pTN]_!$&JJap[|[YO4ON,g4C !U$R4@YҌ_𵘌3xF”O1^V -L(G; >sf)+a|J@Z}Emr|Z`pQ@}'yM zzFyFlne*H3ɲҥ*A9:pE {@<rPrBSG6,oLRr=[olDcƆ$$\UYp)T%OοO_ܴ FiPb.2 ռHSD)PtLB_Qz($(y]tt~TsgF)a(@)^tu_z(偺Z~RK OO'R`a(ZfiO͖Q , 9Q'WXx\B=:(==% /|~)>(OxjRFP|D.{RFPZP9 QWx9J9CiA5g8Jü0#($_:\Q\^4̫WQdJY 0FAIUi^gYP 8Qi8m3 栩(VQ h2h֫%ĥ0aLf9P[:ͻw ZT>/\i: uZE5?V+O%K"ofu~Ri)PM&B!X*IjP*AMF!ldJ &+.aO[wT,ko OwEMbXtbi{1E߼(T̈́6E+ZI;+W]) Ų>J((+D)}M@v汷6:&d'?Y NqTRLGkDho6VŠU .rR6=XKM˾]]=|74FC5LJ񋃋e.{1~qONn+N󊯫+MC[(309P2 a*@ bXynb.B1 qw lnD9t'~yNLɯ.gi;z),xfh%7q*2;x ps( W}iYSIF'clEdގL+AYC5㼮ԢngTW "ZY#EHJ. @^k J\=\FL0]ɺifif\؄TaTL9#T&$xA<,eVCsBMpnFÃX\ohf;u g}QΐUȑfUfn}qӷEV[#GL#ċA1l\*%*m$I;y\0 o#߱.l"8أs9݊ D%~Še@PԦ$!Y&i6_n9R2ԼV)*t2Ke|'[X T/ *%ln1B ڔ(N]W?_{T!aê >a6EC,6*97ۃ7h8(A M0v<FhB#BSy1Y>OlP ԈDJJё}/]wCx#[kxn4j{9_XGzSPLjW[wvL;Vh֓N^6ck[YYNaΔz&gk|h`nϥ 4zo^i֘x|Df?=Dp3tkPmOTa^N6 ^6{#ˁXaxI}+wc-19q;wY)'9Q| ໻wW_=p|nj/O۵%yđ2xL2Vd䶏\'%WZ('ZҾx,u.u_\k^d]+wdjr7 _vt8-rT31htb @8JmkYvl& v|[yN_E2THRA`H,+~/G'W)hSLD4Z^7VTh 6kT#ʳLJA>j%4/y3/3Wjz3dW7h=d{6yhRͼST,..ף߽,kľO?NCZCl^/)h/ԅP]#ζƋn߅ojfQ! 
l:ܧ7) [>''[%lTSՅG:gBJ~6xO(8|g=uQzj >d]M<nTѲxֆOIK"D2\IDtEV]Qii=] .i[1eޮ'zEq@_:epqfhPc$^hFqToH=](O]m95xBM'njL}B5V0)a׼R(aRA2MI 9hi$MZsb<;0stؒcc d*n#)Swյf3ޅ8J;r}%5lS9UR12{)Kg(/<+0\vʤ V"Ms=:\ÿ+OJ  ;Jߙ7JUHѺax?{!Om S53lD?XӢؠ)P4KX ibLT,Mmq&0SF{bgRPy(O UXkS0 7ijqaM9h&-L9d4{6D@ 3 ?p?C&2"/X"4C *өVy =}U3ŵ|0Z0TutdTN0MT"xD4uE9/ MgFH8AR*Dq+3ceYϏHCHDIwl+=JNJPˏ n9Sw)|*33Shr?~I]Q.'.%~$@ i0aT?6JxTk*5Z^<-i ˫5 <fFII$U~wu O_MՋ+#.t~;!kq46\,Q;Ըq5!g}ط6|ep;%\:A='sأ[WhttT_POՂ턙@pAt[o4|[5Zuo]rI(_/cs{vKg֌^^-*H=a=7{}yuZ<>V4 SZg>9Yii3pk#Б {;o?wqS5V)I 5]_)I;a{N6*{Ĩ컍jcBx A[uxid CjgC2d k `ymնX[XACxԄjU>Qp5 X |H%c{ڣN,s| {چs)d[,>6\(C(!ɀޭ y&eSc8b11gxWcOֆsݳ)iSd&1~/qg=+yV(UiǶ.E}tM(t9l9+]xRr+L!o?lf_UnV"4bt5YC =P*yS.P)!#́rp>Z+V҆h{,l.BY7Ls(5z&U甡& #QgQDf_fܗkԥLĹ8䭍1H vw,+~Ǽ>T OPt1?̞Lh=49->u.Z^|MH;Uɧ\R9MS(;i*atƛ,UX}rbj3TA믒S|5/ L4w79SO#ٙNKU:LyC%VzRіsэքR[j^ަmM )De[&6F+%I$062HDQʹJ& ܰĠ6ZZB[ @."[rhWX_=l`蔘dJTIxI"38BfJ2KDIH9tGP0LuȌOk6{U6lU :li9[ʫ{WaQNRSyB"!NkzkA+`5W Y[ZMo݇qi A _[B0Fo\E!bU]$h+2f)/pFf*S $;Q$q" COȌJyCR2Q`,+ʚ$Y891Y@TBӌkB mQ{HQ Tj-;Zb?Jtˏ87k2NSl:~{xQعrcWo8Z2?o؞׬R?KUY../ U3/rї%+;d1Yf+6t`2^aʗOizg\G츂y79 KپPyZЂ 6SjZABG|oKTuA{uK?gZeRdr^`@SaqY&bס<؛#mcߟ1bǖ /Ѧr1տ(e`__+x{NR*qЌL9$4Ig4~hϽO6P-b.[_^*yP'[Pߔ-KQYEj}jvoǚ~V^I+y=̄B4g&; dW>~>\ЩM<-Z2HBIDՙrEE>O8.^qɡ^WNV6u][z!娎u(EO|USnTLP}.קoURrl]}_g 9ة]>8a㉂ctUOyՍZ~(3|>R#f&X[tFF{ʠ٢10txъk^>K, bUh8-M$!-r"]M U9g@qC;jXob^؝*F M5ax$YBX ? 
-"67$1\% Ź9Bb`4DLOsɅ9"br,?jQO7+ؒI"ɠ8 si0QdDgR&̐$UD̚L!H_p,婢B"E2m98 &F;2Z?k^؝rhAO5W$3H3t:$ETE\a8HDg,T} 0)zI43ĠWRKPM)$PYndD1V0N 10r!x֘F%g{(B'&#-BdJe`I a8yfUC т aкqO JXh7ToFAfT i:pwGd귐WvBX'+ FfN=:%TC棦dg{$ PW^yz1+VS]Sa1lLܴtAPgq5ңYަLF4\uZmO>\~[v>\^>[-R!\[a_[~ kmy]` =Gw 0 Tw{|tA (?6_s{kw}|hT+>Hh/vow]m PFE꯿px)@ZmDh9叫IpqNΙ4-닇}3\OnUWK?fsve9v{ph \n^^}S.]ضr7wWhQZz ]MHӃRL9Z3]|>Nv(4%wRs̺lB޹FؔFzvݴ0R b11gxNDQ'׵w/8nmX;7R}ɱx7CAbc:neNNY嬹kOڰwn16YÕit^̨  u!Hz) \ HLNijМѨ)0ދ&2k>jjL(5 `0:%!pquM4ʦg輦fcnN3Bpс{ߣeһzz6,䝛hM Gqz7SynN3Bې ?ӻa!D۔Do}[<ge uBQJs^atԳ9P:jJI=+K;cFWOA.>y{qF&&ňo < -ЗyW7Ci(o/HSV< ^jr  C>5V@von]cge*WР.,ɗ8_TﶗR{\ > <";J3IvfdPFի1~%2w=S}PۧjL۳#WomVէ7Ow ;?8Xucliƴaݬ# ~Zq̰VXVE.N5ruQ#巭GpSU?FkrD8{:3Kv45w1<պe:e~ 49) n~ޒH)$X.lf='^j|OkRŦ.o㳯29z'XK(s({I'.Mr76W >߄DyH(}$ȶ1s;v1zVa a8Ӗ:bY-nܕSN"f]t>eVbaaӊ*@lY#)5c0+83Wa7UOcז3jJ8m2VTq[}RB\A\4eF^쟯]\$RӢ$L˟| 5..-8^˭1)ZN7nB߹s1>^%Z^ռowU:UbѴCmzVCD"qhK낝beRz| L0[P0%IQwÇg;su}l{ly \P;N n+)8%'V1rsSmu '[OֿڎIgeIݲM*Z{x4jjQv܅Moe=M$$fl.nbyI@ if{'Hsզ,wD){rXbpdw5PMW24COp/,W|ְS&t1u(4R^mvnYt1\H 1,c"VO0s"3Ģ7[UU`(=jf}^y^;WX5R )p+MJq$cRc$Mn_ R X(,1Y!~:| l¿FI#Ǧ&t,42Ya&b,SXLmRdr :Ԟ:̙L13tu]U:d -Ct9_u5wFEZ{&8~;9^#Y=^ܙּFy+_Ϯ8gWk˫+pvp› j ^b+IUcpTijBKY̎lhc=ZWSA7ek@ommC F^AMI (, '鄲\i0F"kEZcK$e]I :k 5h}  GڶRxn r3h~d#N Mb,.\&DV" С͌ &g2"mTb],duCa@`t$ފ:r@!LD&:VRwgmlr*8sngI/rz/?MbZ3]fvGnڋ~w&s1v}v4G#"CS%͍ʟ~z5*mybn(-:߿5_Mr-{ ?ϭVSfqT*m]EPS"iP3_VxMX${?*ϥ* ջ5 /JrNxk䔙4sF#[%L;I\ҚüJ7BG-,{/1zC5@9$j):CTD쌏<\щta՟oZz= wUkWt]T`+4Sݱylދ;v1`fQe\0:thpW λ1Ϋ,u]w}4@x!0yй0 Į 9ḳxnT:l9ɴ;olR0+YԒ>Qbʇݢ [!qmQ(ՅR[rT>${Mۨߴ@6t|B9Daʚћ *4ՁAr6*qxB ZiGnbB9Da q+ VxGT>iFgnwk!%Lyޠ|j ZkmފsyTYQǛˇOxQ|vVr*n|}o Z;MIPa]4`v0d;!pUBI;c5y u|I>pSVڕF#6V&KS@H2erbb`LkvT]!ѧdGH4qq,`VyhG_P9w l׫=@>GF6(1w*F[FO g韌,J *OK>|A较t) 0ka(kO#3TYGqDѰYHXD ICt3HWȣvl=Kw2kF:0Jne0V#U!WŌ!یIW7ɐ! 
yʲ, 5jO5ꩰư/D6֑Q֠$'i 8IV2Hkf t" 5O5d{:vj;۩UR Z0BOV&SձJ@d#2(2h"sYZ֙lFt‘)(v;h# YL.?-޴/[ᶘDRKtf`*Bp?7J pʁTCCďmaHZ+6l C n+t8n{'Z,-DžgH!\He 7XJ2!1`4tHO%p*L#(?\f4dQ6r8WEXZjtBD>ιsIN!ab21V%lbIHA1 [wvûIÈ"J+{gMd{,ݍv%K!gttwMUHÆsg8<;PHZd|yWka(9w Zifc n62cxA>yY*C {\w+S6rGϙm w~bzuI2?8d*~|慃慃慃EۉuL#-ĆaXYdFкZ6qB_\o?{WƑ /s̺pZaI>%cвU l4ꮾ@b1fw_e]_.$_]e^WɰѪF *X\|6˛4%tPO4 rD3|NubM3*EĤaB1J$(i#eJta5S-]vЛ]^;valPIp0t,nD)µ#xf9Eb#de`$HLLEP2d8L&[@7u%N&oRMuRIpJ'֚gQ7;lkbQWvғRg*\ jH}e= ϴvQ8oBT+|;jrR[H 9={,܋hG%r˜}%/T1`/af5}G'6to5-bشaӅ mjhCR$OwZ+|Ob7J Ө/bޢFNwUU6d5hm e5#(WU~R.K9>ߨB29|1B^9D0%`&Bt tBQ%v>-(+=[y _qnE7N n6N7D>wIy<[y)TwTA^'4{'s3߱U2|P2Ms[]j=s[ ĄBTKE)p 2E]b(JX]TjLj@J]$R<[`F،v7ّ͈a9tFLW3ʳdL\d΁j-KgdGmONE9OrDŽ ! }:%ZūfUÝT4tWPe˧;n1* QieƁ\R.l!X>v'X}qU-OyѺˠV \N,G( ZڎoiniHx4MW/ᦫi j6Tn2aFO Hƭ?ԀlOÁVȋSh) y9ԣJ-ZUKKR3B }&&+3{O)@ C4S?SoG7|-> Fm;yF&!)#s;Bz-> F.Yl;{F&!Z”'2.k8P>vsKJDvMH(5?4՟L|9yG+ٛ(lnJO4n̾g~jtYiHdA7(\g 83xyx|29!zVoM4-9hLx+dssq|~ 0"Jټ }zv;4!ڮHY A/7}BQy(~̖; wjnjh颼x.>E>>ܛog<%_~2O6dƝ`'^Cw|,Ⴣ?~|ǞB$~wYe4TP^< >XҚ0-Y*L(D)$1)9`l4L$eI1A`HPЬƮԢP^zاx&.K ZRPJ7B5G(=$uYjKOqhO,:Q  sJI}]ZQz(3>g|. P=g\jqvFɡ{s@smT|ݱbMfoԌWw o$>ޚp㯿InִpϝJ3bR4,d[9(6FHaެ97b="1+KvӤ{-ܪ4Ԍoi~yBj]j^8TNRqJHiW*K+FaH& TDIb]N2RQCPȊ)H+b 8k%%Z_+d 6)vK#i* !:չ6?m'flkb|s7.`< 2/өJ7Ot٪q 6Nw,!bOz: ~W"3w&i^:~ .D9,9;zKTAy@ku4ZA \4T;aMDäÎ8S2FXG1KRƎ8L:󆝤V)驼bPU.K\UyU҉䌉FJ 3(bHSf;g'2 1OE!D<*K=6FHce18[y 9s} P KՙQ49+PK \YN3G,YT ³t% >mRC) CR_Fr&>q2R&ʌ} ?RuiJ:a(=$uYj^(=Er3> $r?aB$^3{.!!0<, |^z(4{bR*PK-'RS}:g ~+y?mj0f|JjN$Q{j(=Frp(U\jEnHg&Zjg6Jr2T~9Q+UjSD)0?փ@)0?R,ϞQz(Ei5ZMGT tnq]2"M O3Q"b4gִֺh,bgL&Kf%d^PpM`@Cmgn#4S73~6~4m>_7-z]hmF5Z%5x"Bf@ [{]HoQkWA*Դ9GuI\6\bi65/6Fb)+ƚc >IC.wNO%dnM[oC? 
Feb 02 10:31:57 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 02 10:31:57 crc restorecon[4680]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:57 crc restorecon[4680]:
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:57 crc restorecon[4680]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 02 10:31:57 crc restorecon[4680]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 
10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc 
restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:57 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c19,c24 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 
10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 
10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc 
restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:58 crc restorecon[4680]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 02 10:31:59 crc kubenswrapper[4845]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:31:59 crc kubenswrapper[4845]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 02 10:31:59 crc kubenswrapper[4845]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:31:59 crc kubenswrapper[4845]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 02 10:31:59 crc kubenswrapper[4845]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 02 10:31:59 crc kubenswrapper[4845]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.419062 4845 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.423921 4845 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.423953 4845 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.423963 4845 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.423972 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.423981 4845 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.423990 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.423997 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424009 4845 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424019 4845 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424029 4845 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424038 4845 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424061 4845 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424070 4845 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424079 4845 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424088 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424097 4845 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424106 4845 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424113 4845 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424121 4845 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424130 4845 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424138 4845 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424146 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424154 4845 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424162 4845 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424170 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424177 4845 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424185 4845 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424194 4845 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424202 4845 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424211 4845 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424219 4845 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424228 4845 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424236 4845 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424244 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424252 4845 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424263 4845 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424272 4845 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424281 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424293 4845 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424303 4845 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424311 4845 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424319 4845 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424326 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424334 4845 feature_gate.go:330] unrecognized feature gate: Example
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424344 4845 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424353 4845 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424361 4845 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424369 4845 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424377 4845 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424384 4845 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424392 4845 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424400 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424407 4845 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424415 4845 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424422 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424433 4845 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424442 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424452 4845 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424461 4845 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424469 4845 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424481 4845 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424489 4845 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424497 4845 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424504 4845 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424512 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424519 4845 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424527 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424535 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424542 4845 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424550 4845 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.424557 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427153 4845 flags.go:64] FLAG: --address="0.0.0.0"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427181 4845 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427198 4845 flags.go:64] FLAG: --anonymous-auth="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427209 4845 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427222 4845 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427234 4845 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427245 4845 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427286 4845 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427295 4845 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427304 4845 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427314 4845 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427323 4845 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427333 4845 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427341 4845 flags.go:64] FLAG: --cgroup-root=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427350 4845 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427359 4845 flags.go:64] FLAG: --client-ca-file=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427368 4845 flags.go:64] FLAG: --cloud-config=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427376 4845 flags.go:64] FLAG: --cloud-provider=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427385 4845 flags.go:64] FLAG: --cluster-dns="[]"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427394 4845 flags.go:64] FLAG: --cluster-domain=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427403 4845 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427413 4845 flags.go:64] FLAG: --config-dir=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427421 4845 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427432 4845 flags.go:64] FLAG: --container-log-max-files="5"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427443 4845 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427452 4845 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427461 4845 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427471 4845 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427479 4845 flags.go:64] FLAG: --contention-profiling="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427488 4845 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427497 4845 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427506 4845 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427515 4845 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427525 4845 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427534 4845 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427543 4845 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427552 4845 flags.go:64] FLAG: --enable-load-reader="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427562 4845 flags.go:64] FLAG: --enable-server="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427571 4845 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427582 4845 flags.go:64] FLAG: --event-burst="100"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427593 4845 flags.go:64] FLAG: --event-qps="50"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427602 4845 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427612 4845 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427621 4845 flags.go:64] FLAG: --eviction-hard=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427633 4845 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427642 4845 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427651 4845 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427660 4845 flags.go:64] FLAG: --eviction-soft=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427669 4845 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427677 4845 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427686 4845 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427694 4845 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427703 4845 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427711 4845 flags.go:64] FLAG: --fail-swap-on="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427720 4845 flags.go:64] FLAG: --feature-gates=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427731 4845 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427740 4845 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427749 4845 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427758 4845 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427767 4845 flags.go:64] FLAG: --healthz-port="10248"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427776 4845 flags.go:64] FLAG: --help="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427785 4845 flags.go:64] FLAG: --hostname-override=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427794 4845 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427803 4845 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427812 4845 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427821 4845 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427829 4845 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427838 4845 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427847 4845 flags.go:64] FLAG: --image-service-endpoint=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427856 4845 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427865 4845 flags.go:64] FLAG: --kube-api-burst="100"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427874 4845 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427883 4845 flags.go:64] FLAG: --kube-api-qps="50"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427917 4845 flags.go:64] FLAG: --kube-reserved=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427926 4845 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427934 4845 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427944 4845 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427953 4845 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427962 4845 flags.go:64] FLAG: --lock-file=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427970 4845 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427979 4845 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.427988 4845 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428000 4845 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428009 4845 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428018 4845 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428026 4845 flags.go:64] FLAG: --logging-format="text"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428035 4845 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428045 4845 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428053 4845 flags.go:64] FLAG: --manifest-url=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428062 4845 flags.go:64] FLAG: --manifest-url-header=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428073 4845 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428082 4845 flags.go:64] FLAG: --max-open-files="1000000"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428092 4845 flags.go:64] FLAG: --max-pods="110"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428101 4845 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428110 4845 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428119 4845 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428127 4845 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428136 4845 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428145 4845 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428154 4845 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428173 4845 flags.go:64] FLAG: --node-status-max-images="50"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428182 4845 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428191 4845 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428200 4845 flags.go:64] FLAG: --pod-cidr=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428209 4845 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428233 4845 flags.go:64] FLAG: --pod-manifest-path=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428244 4845 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428255 4845 flags.go:64] FLAG: --pods-per-core="0"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428266 4845 flags.go:64] FLAG: --port="10250"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428277 4845 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428286 4845 flags.go:64] FLAG: --provider-id=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428294 4845 flags.go:64] FLAG: --qos-reserved=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428304 4845 flags.go:64] FLAG: --read-only-port="10255"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428312 4845 flags.go:64] FLAG: --register-node="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428322 4845 flags.go:64] FLAG: --register-schedulable="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428331 4845 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428345 4845 flags.go:64] FLAG: --registry-burst="10"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428353 4845 flags.go:64] FLAG: --registry-qps="5"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428362 4845 flags.go:64] FLAG: --reserved-cpus=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428372 4845 flags.go:64] FLAG: --reserved-memory=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428382 4845 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428391 4845 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428400 4845 flags.go:64] FLAG: --rotate-certificates="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428408 4845 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428417 4845 flags.go:64] FLAG: --runonce="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428426 4845 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428435 4845 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428444 4845 flags.go:64] FLAG: --seccomp-default="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428452 4845 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428461 4845 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428470 4845 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428479 4845 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428489 4845 flags.go:64] FLAG: --storage-driver-password="root"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428497 4845 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428506 4845 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428515 4845 flags.go:64] FLAG: --storage-driver-user="root"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428524 4845 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428533 4845 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428542 4845 flags.go:64] FLAG: --system-cgroups=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428550 4845 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428564 4845 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428572 4845 flags.go:64] FLAG: --tls-cert-file=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428581 4845 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428590 4845 flags.go:64] FLAG: --tls-min-version=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428599 4845 flags.go:64] FLAG: --tls-private-key-file=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428608 4845 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428616 4845 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428625 4845 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428635 4845 flags.go:64] FLAG: --v="2"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428646 4845 flags.go:64] FLAG: --version="false"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428658 4845 flags.go:64] FLAG: --vmodule=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428669 4845 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.428678 4845 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428877 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428911 4845 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428920 4845 feature_gate.go:330] unrecognized feature gate: Example
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428928 4845 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428936 4845 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428944 4845 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428952 4845 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428960 4845 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428969 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428977 4845 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428986 4845 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.428994 4845 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429002 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429018 4845 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429026 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429034 4845 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429042 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429049 4845 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429057 4845 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429065 4845 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429072 4845 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429080 4845 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429088 4845 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429096 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429104 4845 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429111 4845 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429119 4845 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429126 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429136 4845 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429145 4845 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429154 4845 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429170 4845 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429180 4845 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429188 4845 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429197 4845 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429205 4845 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429213 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429221 4845 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429228 4845 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429236 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429244 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429251 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429259 4845 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429266 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429276 4845 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429289 4845 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429298 4845 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429306 4845 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429316 4845 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429326 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429337 4845 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429346 4845 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429355 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429364 4845 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429372 4845 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429380 4845 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429387 4845 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429395 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429402 4845 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429410 4845 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429418 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429425 4845 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429433 4845 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429440 4845 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429448 4845 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429456 4845 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429464 4845 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429473 4845 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429480 4845
feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429488 4845 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.429498 4845 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.430849 4845 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.444424 4845 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.444487 4845 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444610 4845 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444624 4845 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444631 4845 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444637 4845 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444644 4845 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444650 4845 feature_gate.go:330] unrecognized feature gate: 
BootcNodeManagement Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444658 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444666 4845 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444673 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444681 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444689 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444695 4845 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444701 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444708 4845 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444713 4845 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444718 4845 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444723 4845 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444728 4845 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444733 4845 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444740 4845 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444748 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444754 4845 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444760 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444767 4845 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444772 4845 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444779 4845 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444783 4845 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444789 4845 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444795 4845 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444801 4845 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444807 4845 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444812 4845 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444819 4845 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444825 4845 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 
10:31:59.444832 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444839 4845 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444846 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444852 4845 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444858 4845 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444864 4845 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444871 4845 feature_gate.go:330] unrecognized feature gate: Example Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444878 4845 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444910 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444916 4845 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444922 4845 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444930 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444936 4845 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444942 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444947 4845 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444952 4845 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444958 4845 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444963 4845 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444969 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444975 4845 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444980 4845 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444985 4845 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444990 4845 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.444995 4845 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445000 4845 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445005 4845 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445010 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445015 4845 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445022 4845 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445030 4845 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445036 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445042 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445046 4845 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445051 4845 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445057 4845 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445064 4845 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445070 4845 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.445080 4845 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445263 4845 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445275 4845 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445284 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445291 4845 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445297 4845 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445303 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445308 4845 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445314 4845 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445320 4845 feature_gate.go:330] unrecognized feature gate: Example
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445328 4845 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445333 4845 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445339 4845 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445345 4845 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445350 4845 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445358 4845 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445363 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445368 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445373 4845 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445378 4845 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445384 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445389 4845 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445394 4845 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445399 4845 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445405 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445411 4845 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445416 4845 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445422 4845 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445427 4845 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445432 4845 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445438 4845 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445444 4845 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445449 4845 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445454 4845 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445461 4845 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445467 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445473 4845 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445478 4845 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445484 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445490 4845 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445495 4845 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445503 4845 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445508 4845 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445515 4845 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445520 4845 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445526 4845 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445532 4845 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445538 4845 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445543 4845 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445548 4845 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445553 4845 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445558 4845 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445563 4845 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445569 4845 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445574 4845 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445579 4845 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445585 4845 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445590 4845 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445595 4845 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445600 4845 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445605 4845 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445610 4845 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445616 4845 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445621 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445627 4845 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445632 4845 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445637 4845 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445644 4845 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445650 4845 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445656 4845 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445661 4845 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.445667 4845 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.445676 4845 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.445938 4845 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.450999 4845 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.451117 4845 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.453127 4845 server.go:997] "Starting client certificate rotation"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.453166 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.453398 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-09 18:53:15.651154807 +0000 UTC
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.453574 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.488660 4845 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.488713 4845 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.491302 4845 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.509984 4845 log.go:25] "Validated CRI v1 runtime API"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.553529 4845 log.go:25] "Validated CRI v1 image API"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.556553 4845 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.565089 4845 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-02-10-27-29-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.565156 4845 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.596525 4845 manager.go:217] Machine: {Timestamp:2026-02-02 10:31:59.592541207 +0000 UTC m=+0.683942737 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a0f7ad40-dfc7-4c48-b08f-9dc9799ca728 BootID:8f18ce78-9cc3-4dbd-9d49-5987790a156d Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:20:07:db Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:20:07:db Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d9:70:55 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:9c:d1:3f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:d2:9e:7c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f5:c4:55 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:26:ec:c3:07:d8:6f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:4e:33:a6:ce:a0:af Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.597001 4845 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.597282 4845 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.599394 4845 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.599745 4845 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.599811 4845 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":nul
l,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.600253 4845 topology_manager.go:138] "Creating topology manager with none policy" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.600276 4845 container_manager_linux.go:303] "Creating device plugin manager" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.601014 4845 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.601072 4845 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.602203 4845 state_mem.go:36] "Initialized new in-memory state store" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.602358 4845 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.607733 4845 kubelet.go:418] "Attempting to sync node with API server" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.607773 4845 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.607815 4845 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.607839 4845 kubelet.go:324] "Adding apiserver pod source" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.607860 4845 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.613569 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.613683 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.613683 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.613770 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.617292 4845 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.618490 4845 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.620242 4845 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622087 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622131 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622147 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622163 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622186 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622201 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622215 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622237 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622252 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622267 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622286 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.622300 4845 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.623332 4845 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.624032 4845 server.go:1280] "Started kubelet" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.625143 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.625250 4845 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.625982 4845 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.626341 4845 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 02 10:31:59 crc systemd[1]: Started Kubernetes Kubelet. Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.627558 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.627608 4845 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.627852 4845 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.627936 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 21:03:27.718556966 +0000 UTC Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.628170 4845 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.628191 4845 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 
10:31:59.628241 4845 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.629030 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.629165 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.629281 4845 factory.go:55] Registering systemd factory Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.629321 4845 factory.go:221] Registration of the systemd container factory successfully Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.630963 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.631190 4845 factory.go:153] Registering CRI-O factory Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.631226 4845 factory.go:221] Registration of the crio container factory successfully Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.631371 4845 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 02 10:31:59 crc 
kubenswrapper[4845]: I0202 10:31:59.631420 4845 factory.go:103] Registering Raw factory Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.631453 4845 manager.go:1196] Started watching for new ooms in manager Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.636858 4845 server.go:460] "Adding debug handlers to kubelet server" Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.636300 4845 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18906760f191cc19 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:31:59.623990297 +0000 UTC m=+0.715391777,LastTimestamp:2026-02-02 10:31:59.623990297 +0000 UTC m=+0.715391777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.638620 4845 manager.go:319] Starting recovery of all containers Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.654622 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655242 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 02 10:31:59 crc 
kubenswrapper[4845]: I0202 10:31:59.655269 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655291 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655310 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655329 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655349 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655377 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655402 
4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655419 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655436 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655454 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655472 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655500 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655519 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655546 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655601 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655623 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655641 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655661 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655678 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655697 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655716 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655733 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655750 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655772 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655795 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" 
seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655816 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655874 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655924 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.655971 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656013 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656040 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656066 4845 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656086 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656106 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656124 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656146 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656168 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656186 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656208 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656229 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656249 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656269 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656291 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656311 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656332 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656354 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656376 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656396 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656415 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656435 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656463 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656483 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656508 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656528 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656548 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656569 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656589 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656611 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656631 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656651 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656671 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656693 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656712 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656733 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656752 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656774 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656793 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656814 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656835 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656857 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656878 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656927 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656946 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656966 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.656987 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657013 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657043 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657073 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657095 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657115 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657136 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657158 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657178 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657199 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657223 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657243 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657261 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657281 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657299 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657318 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657337 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657356 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657374 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657396 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657416 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657435 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657454 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657473 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657493 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657514 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657533 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657553 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657585 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657607 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657627 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657649 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657671 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657691 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.657714 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661184 4845 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661245 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661271 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661293 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661316 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661339 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661361 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661384 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661406 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661428 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661450 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661473 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661495 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661516 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661537 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661561 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661587 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661606 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661627 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661647 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661691 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661712 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661732 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661753 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661772 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661792 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661811 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661832 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661853 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661873 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661921 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.661943 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662009 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662036 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662064 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662086 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662108 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662129 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662148 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662168 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662190 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662208 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662227 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662249 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662270 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662291 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662311 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662330 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662354 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662378 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662400 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662419 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662442 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662462 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662483 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662502 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662522 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662544 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662566 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662587 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662607 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662626 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662644 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662666 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 02 10:31:59 crc kubenswrapper[4845]: I0202
10:31:59.662688 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662706 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662728 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662746 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662766 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662787 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662806 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662828 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662848 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662867 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662919 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662940 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662962 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.662980 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663000 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663023 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663054 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663074 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663094 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" 
seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663112 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663130 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663149 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663166 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663190 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663209 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 
10:31:59.663234 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663254 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663276 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663297 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663317 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663334 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663353 4845 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663377 4845 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663395 4845 reconstruct.go:97] "Volume reconstruction finished" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.663408 4845 reconciler.go:26] "Reconciler: start to sync state" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.666745 4845 manager.go:324] Recovery completed Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.678137 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.681608 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.681649 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.681663 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.684271 4845 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.684300 4845 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.684329 4845 state_mem.go:36] "Initialized new in-memory state store" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.702899 4845 policy_none.go:49] "None 
policy: Start" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.704406 4845 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.704443 4845 state_mem.go:35] "Initializing new in-memory state store" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.706317 4845 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.711256 4845 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.711308 4845 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.711345 4845 kubelet.go:2335] "Starting kubelet main sync loop" Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.711408 4845 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 02 10:31:59 crc kubenswrapper[4845]: W0202 10:31:59.714022 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.714179 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.728744 4845 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 
10:31:59.764528 4845 manager.go:334] "Starting Device Plugin manager" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.765004 4845 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.765051 4845 server.go:79] "Starting device plugin registration server" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.765975 4845 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.766038 4845 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.766289 4845 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.766549 4845 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.766585 4845 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.777609 4845 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.812060 4845 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.812227 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.813691 4845 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.813773 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.813798 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.814120 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.814485 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.815099 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.818632 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.818671 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.818700 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.819827 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.819830 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.820019 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.820070 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.820140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.820164 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.821107 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.821140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.821148 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.821292 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.821342 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.821355 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.821566 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc 
kubenswrapper[4845]: I0202 10:31:59.821691 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.821733 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.822735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.822786 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.822803 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.823123 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.823157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.823175 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.823392 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.823687 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.823796 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.824697 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.824770 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.824799 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.825035 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.825071 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.825089 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.825213 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.825277 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.826656 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.826701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.826719 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.832674 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.865571 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.865636 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.865676 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.865710 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.865871 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.865966 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866026 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866102 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866150 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866188 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866237 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866263 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866299 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866323 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.866358 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.867948 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.870087 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.870131 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.870150 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.870189 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:31:59 crc kubenswrapper[4845]: E0202 10:31:59.870635 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 02 10:31:59 crc 
kubenswrapper[4845]: I0202 10:31:59.967934 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968287 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968510 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968679 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968873 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968623 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968400 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968794 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968139 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.968954 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969071 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969251 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969286 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969319 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969331 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969353 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969380 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969408 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969386 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969413 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969458 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969466 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc 
kubenswrapper[4845]: I0202 10:31:59.969510 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969488 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969490 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969580 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969618 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969633 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969668 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:59 crc kubenswrapper[4845]: I0202 10:31:59.969685 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.071026 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.073391 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.073670 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.073865 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.074080 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:32:00 crc kubenswrapper[4845]: E0202 10:32:00.075218 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 
10:32:00.162462 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.175571 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.202438 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.229756 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 02 10:32:00 crc kubenswrapper[4845]: E0202 10:32:00.233933 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="800ms" Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.237563 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-73c9a9f0b2b452aab98efd95c7ca2d32f9daab3ea016c7b91f0a9f15ea54e21d WatchSource:0}: Error finding container 73c9a9f0b2b452aab98efd95c7ca2d32f9daab3ea016c7b91f0a9f15ea54e21d: Status 404 returned error can't find the container with id 73c9a9f0b2b452aab98efd95c7ca2d32f9daab3ea016c7b91f0a9f15ea54e21d Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.237682 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.244521 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-979577b20829b0ebe0549162765f5b2ce2677fa842bd6af8dee6ff0913c51afb WatchSource:0}: Error finding container 979577b20829b0ebe0549162765f5b2ce2677fa842bd6af8dee6ff0913c51afb: Status 404 returned error can't find the container with id 979577b20829b0ebe0549162765f5b2ce2677fa842bd6af8dee6ff0913c51afb Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.253971 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-8ba7686ff8429e7de1af9d494078bbd5f7cf03a9e39eb4e971c7672025f08a20 WatchSource:0}: Error finding container 8ba7686ff8429e7de1af9d494078bbd5f7cf03a9e39eb4e971c7672025f08a20: Status 404 returned error can't find the container with id 8ba7686ff8429e7de1af9d494078bbd5f7cf03a9e39eb4e971c7672025f08a20 Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.255521 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-48242bb7d1e237c901773838d3968989bd23340651806a5a4bd94e70ce5eb3b9 WatchSource:0}: Error finding container 48242bb7d1e237c901773838d3968989bd23340651806a5a4bd94e70ce5eb3b9: Status 404 returned error can't find the container with id 48242bb7d1e237c901773838d3968989bd23340651806a5a4bd94e70ce5eb3b9 Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.264237 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-aedb1aab9603712abe233d308c0e51e40d2fb6e3bffd516b40b4ba182d1bb429 
WatchSource:0}: Error finding container aedb1aab9603712abe233d308c0e51e40d2fb6e3bffd516b40b4ba182d1bb429: Status 404 returned error can't find the container with id aedb1aab9603712abe233d308c0e51e40d2fb6e3bffd516b40b4ba182d1bb429 Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.476118 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.478631 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.478706 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.478726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.478776 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:32:00 crc kubenswrapper[4845]: E0202 10:32:00.479593 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.528019 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:00 crc kubenswrapper[4845]: E0202 10:32:00.528132 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection 
refused" logger="UnhandledError" Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.621072 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:00 crc kubenswrapper[4845]: E0202 10:32:00.621264 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.627193 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.629055 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 23:51:55.856985181 +0000 UTC Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.719915 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aedb1aab9603712abe233d308c0e51e40d2fb6e3bffd516b40b4ba182d1bb429"} Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.721178 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"48242bb7d1e237c901773838d3968989bd23340651806a5a4bd94e70ce5eb3b9"} Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.722520 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8ba7686ff8429e7de1af9d494078bbd5f7cf03a9e39eb4e971c7672025f08a20"} Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.724751 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"979577b20829b0ebe0549162765f5b2ce2677fa842bd6af8dee6ff0913c51afb"} Feb 02 10:32:00 crc kubenswrapper[4845]: I0202 10:32:00.726590 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"73c9a9f0b2b452aab98efd95c7ca2d32f9daab3ea016c7b91f0a9f15ea54e21d"} Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.736771 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:00 crc kubenswrapper[4845]: E0202 10:32:00.736877 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:32:00 crc kubenswrapper[4845]: W0202 10:32:00.908817 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 
10:32:00 crc kubenswrapper[4845]: E0202 10:32:00.908966 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:32:01 crc kubenswrapper[4845]: E0202 10:32:01.035127 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.280378 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.282190 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.282259 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.282279 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.282341 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:32:01 crc kubenswrapper[4845]: E0202 10:32:01.283090 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.614765 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 02 
10:32:01 crc kubenswrapper[4845]: E0202 10:32:01.616787 4845 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.627556 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.629780 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 06:22:58.267556801 +0000 UTC Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.732678 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850" exitCode=0 Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.732780 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850"} Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.732849 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.734316 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.734390 4845 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.734413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.735513 4845 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676" exitCode=0 Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.735587 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676"} Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.735625 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.737064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.737124 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.737149 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.738347 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.742921 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.742999 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 
crc kubenswrapper[4845]: I0202 10:32:01.743028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.743315 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.743195 4845 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb" exitCode=0 Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.743830 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb"} Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.744518 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.744552 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.744572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.749168 4845 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20" exitCode=0 Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.749360 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.749430 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20"} Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.750719 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.750753 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.750771 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.755251 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1"} Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.755345 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce"} Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.755382 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d"} Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.755408 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf"} Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.755585 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.757181 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.757256 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4845]: I0202 10:32:01.757284 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.628075 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.630412 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 13:57:46.726250234 +0000 UTC Feb 02 10:32:02 crc kubenswrapper[4845]: E0202 10:32:02.636085 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="3.2s" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.763253 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.763316 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.763328 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.763337 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.767295 4845 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1" exitCode=0 Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.767394 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.767485 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.768679 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc 
kubenswrapper[4845]: I0202 10:32:02.768728 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.768739 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.770455 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.771311 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.772582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.772637 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.772653 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.775428 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.775512 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.775579 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.775595 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56"} Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.775609 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.776909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.776948 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.776960 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.777318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.777346 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.777355 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4845]: W0202 10:32:02.792471 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": 
dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:02 crc kubenswrapper[4845]: E0202 10:32:02.792571 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:32:02 crc kubenswrapper[4845]: W0202 10:32:02.812319 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:02 crc kubenswrapper[4845]: E0202 10:32:02.812437 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.884040 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.885405 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.885442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.885454 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4845]: I0202 10:32:02.885479 4845 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Feb 02 10:32:02 crc kubenswrapper[4845]: E0202 10:32:02.886402 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.224:6443: connect: connection refused" node="crc" Feb 02 10:32:02 crc kubenswrapper[4845]: W0202 10:32:02.937813 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:02 crc kubenswrapper[4845]: E0202 10:32:02.937930 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.001283 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:32:03 crc kubenswrapper[4845]: W0202 10:32:03.122284 4845 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.224:6443: connect: connection refused Feb 02 10:32:03 crc kubenswrapper[4845]: E0202 10:32:03.122413 4845 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.224:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 
10:32:03.631092 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 18:22:33.666920729 +0000 UTC Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.784194 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b"} Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.784410 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.785807 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.785864 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.785874 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.788059 4845 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3" exitCode=0 Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.788196 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.788203 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.788702 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3"} Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.788852 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.789519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.789554 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.789569 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.789838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.789920 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.789940 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.790562 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.790599 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4845]: I0202 10:32:03.790609 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.178523 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.178820 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.180982 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.181139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.181164 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.631586 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 21:10:28.851185349 +0000 UTC Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.798288 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd"} Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.798358 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209"} Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.798386 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc"} Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.798434 4845 prober_manager.go:312] "Failed to trigger a 
manual run" probe="Readiness" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.798540 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.798457 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.800015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.800091 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.800107 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.800233 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.800266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4845]: I0202 10:32:04.800277 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc kubenswrapper[4845]: I0202 10:32:05.631950 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 15:06:40.870970435 +0000 UTC Feb 02 10:32:05 crc kubenswrapper[4845]: I0202 10:32:05.745953 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 02 10:32:05 crc kubenswrapper[4845]: I0202 10:32:05.807030 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0"} Feb 02 10:32:05 crc kubenswrapper[4845]: I0202 10:32:05.807101 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267"} Feb 02 10:32:05 crc kubenswrapper[4845]: I0202 10:32:05.807314 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:05 crc kubenswrapper[4845]: I0202 10:32:05.809053 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4845]: I0202 10:32:05.809123 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4845]: I0202 10:32:05.809142 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.086540 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.088258 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.088320 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.088343 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.088383 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 
10:32:06.134332 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.134541 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.134612 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.136768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.136814 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.136837 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.580182 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.596635 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.596937 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.599137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.599213 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.599229 4845 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.632596 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 05:03:23.952659748 +0000 UTC Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.639881 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.810976 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.811055 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.811119 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.812735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.812774 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.812791 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.813053 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.813144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4845]: I0202 10:32:06.813167 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 
10:32:07.333552 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.333843 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.335515 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.335586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.335611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.633780 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 08:56:05.700791281 +0000 UTC Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.813630 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.815006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.815103 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.815124 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.919633 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 
10:32:07.919943 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.921420 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.921495 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:07 crc kubenswrapper[4845]: I0202 10:32:07.921516 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:08 crc kubenswrapper[4845]: I0202 10:32:08.634586 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 08:30:52.36641614 +0000 UTC Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.230588 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.230950 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.232738 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.232807 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.232832 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.237586 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:09 crc 
kubenswrapper[4845]: I0202 10:32:09.597670 4845 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.597770 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.635149 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 03:33:38.290489499 +0000 UTC Feb 02 10:32:09 crc kubenswrapper[4845]: E0202 10:32:09.777856 4845 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.818292 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.819960 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.820021 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:09 crc kubenswrapper[4845]: I0202 10:32:09.820041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4845]: I0202 10:32:10.635779 4845 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 11:06:41.749009381 +0000 UTC Feb 02 10:32:11 crc kubenswrapper[4845]: I0202 10:32:11.636847 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 05:59:04.427357989 +0000 UTC Feb 02 10:32:12 crc kubenswrapper[4845]: I0202 10:32:12.637680 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 00:03:39.003811264 +0000 UTC Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.494217 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.494443 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.495668 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.495714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.495728 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.626924 4845 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.638392 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2025-12-14 08:24:43.774727274 +0000 UTC Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.831446 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.834176 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b" exitCode=255 Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.834237 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b"} Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.834434 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.835445 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.835508 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.835533 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4845]: I0202 10:32:13.836453 4845 scope.go:117] "RemoveContainer" containerID="74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.185602 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.185822 4845 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.187375 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.187436 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.187457 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.519519 4845 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.519584 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.527403 4845 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.527471 4845 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.639383 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 04:54:57.328252696 +0000 UTC Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.840559 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.843107 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5"} Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.843283 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.844511 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.844575 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4845]: I0202 10:32:14.844595 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4845]: I0202 10:32:15.639995 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 15:54:12.263585485 +0000 UTC Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.143871 4845 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.144165 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.144274 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.145746 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.145792 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.145804 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.156207 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.640537 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 11:19:59.787855823 +0000 UTC Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.850857 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.852564 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.852658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4845]: I0202 10:32:16.852692 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4845]: I0202 10:32:17.680008 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 11:30:05.292821977 +0000 UTC Feb 02 10:32:17 crc kubenswrapper[4845]: I0202 10:32:17.853871 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:17 crc kubenswrapper[4845]: I0202 10:32:17.855359 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4845]: I0202 10:32:17.855627 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4845]: I0202 10:32:17.855655 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4845]: I0202 10:32:18.680578 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 10:15:07.935169359 +0000 UTC Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.520302 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.522688 4845 trace.go:236] Trace[685841295]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:32:06.910) (total time: 12612ms): Feb 02 10:32:19 crc kubenswrapper[4845]: Trace[685841295]: ---"Objects listed" error: 12612ms (10:32:19.522) Feb 02 10:32:19 crc kubenswrapper[4845]: Trace[685841295]: [12.612426617s] [12.612426617s] END 
Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.522709 4845 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.522839 4845 trace.go:236] Trace[792374462]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:32:08.602) (total time: 10920ms): Feb 02 10:32:19 crc kubenswrapper[4845]: Trace[792374462]: ---"Objects listed" error: 10920ms (10:32:19.522) Feb 02 10:32:19 crc kubenswrapper[4845]: Trace[792374462]: [10.920455954s] [10.920455954s] END Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.522912 4845 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.523358 4845 trace.go:236] Trace[1986920572]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:32:08.432) (total time: 11090ms): Feb 02 10:32:19 crc kubenswrapper[4845]: Trace[1986920572]: ---"Objects listed" error: 11090ms (10:32:19.523) Feb 02 10:32:19 crc kubenswrapper[4845]: Trace[1986920572]: [11.090824877s] [11.090824877s] END Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.523378 4845 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.525052 4845 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.525203 4845 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.527013 4845 trace.go:236] Trace[1290163049]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:32:08.771) (total time: 10755ms): Feb 02 10:32:19 crc 
kubenswrapper[4845]: Trace[1290163049]: ---"Objects listed" error: 10755ms (10:32:19.526) Feb 02 10:32:19 crc kubenswrapper[4845]: Trace[1290163049]: [10.755385096s] [10.755385096s] END Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.527059 4845 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.534607 4845 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.560931 4845 csr.go:261] certificate signing request csr-npg7j is approved, waiting to be issued Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.576042 4845 csr.go:257] certificate signing request csr-npg7j is issued Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.597690 4845 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.597794 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.620570 4845 apiserver.go:52] "Watching apiserver" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.624133 4845 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 02 10:32:19 crc kubenswrapper[4845]: 
I0202 10:32:19.624436 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.624919 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.624968 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.625022 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.625169 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.625243 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.625829 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.625860 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.625929 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.625951 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.626379 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.627643 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.628054 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.628186 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.628287 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.628721 4845 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.629000 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.629159 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.629761 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.630188 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 
10:32:19.652326 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.669534 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.681558 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 18:50:22.632698342 +0000 UTC Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.682078 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.692918 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.704029 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-rzb6b"] Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.704347 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rzb6b" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.704943 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.706647 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.709623 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.709668 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.721449 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726488 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726529 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726551 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726574 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726593 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726614 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726635 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726655 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726676 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726696 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726717 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726737 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726759 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726782 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726804 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726825 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726845 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726870 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726914 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726937 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726958 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726979 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.726987 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727000 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727061 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727081 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727099 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 
10:32:19.727118 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727133 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727150 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727169 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727189 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727206 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod 
\"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727222 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727240 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727255 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727277 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727294 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727315 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727333 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727349 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727365 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727386 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727407 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727428 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727448 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727469 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727492 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727517 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727547 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727568 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727591 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727611 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727636 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727653 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" 
(UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727669 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727692 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727715 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727765 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727782 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727828 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727843 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727859 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727874 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727915 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727932 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727951 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727967 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727983 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728000 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728019 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728037 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728053 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728071 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728113 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728130 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728146 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728161 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728179 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728195 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728213 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728231 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728246 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728262 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728280 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728297 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728312 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728329 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728347 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728365 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728381 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728399 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728414 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728431 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod 
\"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728447 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728463 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728482 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728498 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728513 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728530 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728544 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728561 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728611 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728630 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728644 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:32:19 crc kubenswrapper[4845]: 
I0202 10:32:19.728661 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728677 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728694 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728716 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728733 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728752 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728770 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728787 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728803 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728820 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728836 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:32:19 crc kubenswrapper[4845]: 
I0202 10:32:19.728851 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728870 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728914 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728940 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728965 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728986 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729009 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729034 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729061 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729084 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729106 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729131 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729155 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729180 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729203 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729229 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729251 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" 
(UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729274 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729295 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729322 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729345 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729369 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 
10:32:19.729396 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729421 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729445 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729470 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729494 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729518 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729542 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729568 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729589 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729612 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729636 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " 
Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729659 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729681 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729709 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729736 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729761 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729783 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729808 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729831 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729854 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729874 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730017 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730071 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730094 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730117 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730148 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730171 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730194 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730216 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730243 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730267 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730293 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730316 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730343 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730369 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730393 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730413 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730438 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730463 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730490 
4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730513 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730536 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730559 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730579 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730600 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730625 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730646 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730669 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730691 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730714 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 
10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730737 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730761 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730898 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730934 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730966 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730994 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731018 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731053 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731106 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731139 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731172 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731202 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731229 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731254 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731281 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731311 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731345 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731372 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731399 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731426 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731454 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731482 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731553 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.727290 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728372 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). 
InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728709 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.728641 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.732964 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729025 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729041 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729248 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729296 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729351 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729659 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.729676 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730222 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730439 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730448 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730531 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730549 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.730819 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731288 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731434 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731521 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731535 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.731754 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.732188 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.733096 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.732430 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.732557 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.732835 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.733234 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.732912 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.732974 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.733005 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.733054 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.733554 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.733625 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.733936 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.734081 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.734108 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.734221 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.735315 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.735597 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.735817 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.735839 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.735970 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.736038 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.736066 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.736404 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.736516 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.736874 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.736949 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.736997 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.737135 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.737269 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.737298 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.738277 4845 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.738964 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.739380 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.744351 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.735835 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.744563 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.746925 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:20.246842876 +0000 UTC m=+21.338244426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.748317 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.748424 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.748513 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-02 10:32:20.248481984 +0000 UTC m=+21.339883474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.755059 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:32:20.255019964 +0000 UTC m=+21.346421494 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.759089 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.759122 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.759142 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.759218 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:20.259194206 +0000 UTC m=+21.350595746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.768364 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.779223 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.779270 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.779290 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:19 crc kubenswrapper[4845]: E0202 10:32:19.779372 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:20.279346834 +0000 UTC m=+21.370748294 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.784083 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.784279 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.785104 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.785341 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.785361 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.786964 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.787223 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.787302 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.787340 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.787600 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.787814 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.787977 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.787995 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.788158 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.788249 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.788366 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.788616 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.788689 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.789029 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.789217 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.789726 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.794447 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.794655 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.794668 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.794839 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.795042 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.795213 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.795308 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.795382 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.795401 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.795587 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.795861 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.796098 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.796188 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.799108 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.806196 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.806211 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.806383 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.808472 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.808752 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.808938 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.810351 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.810254 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.812316 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.812779 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.813035 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.813075 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.813238 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.813383 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.813427 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.813534 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.813690 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.814013 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.814102 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.814672 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.814881 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.815235 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.815454 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.816560 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.818408 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.819696 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.819727 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.820494 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.820527 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.820599 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.820709 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.820789 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.821092 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.821125 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.821147 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.821168 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.821462 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.821467 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.822347 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.824316 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.825077 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.825275 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.825331 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.825929 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.825969 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.826075 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.825207 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.826797 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.827140 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.827213 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.827822 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.828293 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.831408 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.833409 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.833945 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b94e8620-d850-4036-b311-42b2a6369c73-hosts-file\") pod \"node-resolver-rzb6b\" (UID: \"b94e8620-d850-4036-b311-42b2a6369c73\") " pod="openshift-dns/node-resolver-rzb6b" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.833991 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5szs4\" (UniqueName: \"kubernetes.io/projected/b94e8620-d850-4036-b311-42b2a6369c73-kube-api-access-5szs4\") pod \"node-resolver-rzb6b\" (UID: \"b94e8620-d850-4036-b311-42b2a6369c73\") " pod="openshift-dns/node-resolver-rzb6b" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834028 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834096 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834163 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834184 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834205 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834220 4845 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834233 4845 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834247 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834261 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834274 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834286 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834298 4845 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834293 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834310 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834366 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834386 4845 reconciler_common.go:293] "Volume detached for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834426 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834463 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834510 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834542 4845 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834560 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834577 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834595 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" 
Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834627 4845 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834659 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834691 4845 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834708 4845 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834743 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834776 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834809 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834842 
4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834859 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834966 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.834991 4845 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835009 4845 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835027 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835061 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835100 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835157 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835191 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835226 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835244 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835261 4845 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835308 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835327 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" 
Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835344 4845 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835379 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835399 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835445 4845 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835477 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835509 4845 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835527 4845 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835560 4845 reconciler_common.go:293] 
"Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835593 4845 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835625 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835657 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835704 4845 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835880 4845 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835920 4845 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835939 4845 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835951 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.835971 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.836766 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.837006 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.837215 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.837277 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.837462 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.837490 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.837789 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.837939 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838191 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838281 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838300 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838340 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838384 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838445 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838479 4845 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838500 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838548 4845 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838593 4845 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838638 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838671 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838690 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838739 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838772 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838832 4845 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838864 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838883 4845 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.838957 4845 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839004 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839051 4845 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839085 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839103 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") 
on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839163 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839196 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839242 4845 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839278 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839297 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839343 4845 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839376 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" 
Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839409 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839453 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839484 4845 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839502 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839534 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839564 4845 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839596 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839654 4845 
reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839687 4845 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839705 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839723 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839741 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839757 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839774 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839790 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839809 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839825 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839842 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839859 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839876 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839915 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839932 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839949 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839966 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.839983 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840000 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840016 4845 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840033 4845 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840050 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 
10:32:19.840067 4845 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840084 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840104 4845 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840122 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840138 4845 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840155 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840173 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840191 4845 reconciler_common.go:293] "Volume detached for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840207 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840225 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840242 4845 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840261 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840277 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840294 4845 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840312 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840329 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840346 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840363 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840380 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840396 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840413 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840430 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840446 4845 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.840463 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.841139 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.841197 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.841410 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.842332 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.842807 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.843196 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.843456 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.843643 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.843688 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.843939 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.843973 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.844999 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.845097 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.845452 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.845552 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.845707 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.846526 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.848009 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.848376 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.851378 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.851546 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.851620 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.852799 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.853131 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.853243 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.853626 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.853731 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.853814 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.854544 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.854836 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.855037 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.855199 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.856377 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.856425 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.856571 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.856876 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.857081 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.857116 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.857371 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.857477 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.858071 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.858238 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.859092 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.861960 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.876099 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.886685 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.889540 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.896438 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.898570 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.898712 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.915456 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.924026 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.935202 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941291 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b94e8620-d850-4036-b311-42b2a6369c73-hosts-file\") pod \"node-resolver-rzb6b\" (UID: \"b94e8620-d850-4036-b311-42b2a6369c73\") " pod="openshift-dns/node-resolver-rzb6b" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941336 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5szs4\" (UniqueName: \"kubernetes.io/projected/b94e8620-d850-4036-b311-42b2a6369c73-kube-api-access-5szs4\") pod \"node-resolver-rzb6b\" (UID: \"b94e8620-d850-4036-b311-42b2a6369c73\") " pod="openshift-dns/node-resolver-rzb6b" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941418 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941433 4845 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941446 4845 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941458 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941469 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941481 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941493 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941505 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc 
kubenswrapper[4845]: I0202 10:32:19.941517 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941529 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941541 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941552 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941564 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941576 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941588 4845 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941601 4845 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941612 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941624 4845 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941636 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941648 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941661 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941674 4845 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941686 4845 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941697 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941709 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941720 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941732 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941744 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941757 4845 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941768 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941780 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941791 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941802 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941814 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941825 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941835 4845 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941850 4845 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: 
I0202 10:32:19.941860 4845 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941872 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941901 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941913 4845 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941925 4845 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941939 4845 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941951 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941963 4845 reconciler_common.go:293] "Volume detached for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941975 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941989 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942005 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942022 4845 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942034 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942046 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942057 4845 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942068 4845 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942085 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942097 4845 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942110 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.942122 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.941472 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b94e8620-d850-4036-b311-42b2a6369c73-hosts-file\") pod \"node-resolver-rzb6b\" (UID: \"b94e8620-d850-4036-b311-42b2a6369c73\") " pod="openshift-dns/node-resolver-rzb6b" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.945003 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.953924 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.961952 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.969697 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.973086 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.974714 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5szs4\" (UniqueName: \"kubernetes.io/projected/b94e8620-d850-4036-b311-42b2a6369c73-kube-api-access-5szs4\") pod \"node-resolver-rzb6b\" (UID: \"b94e8620-d850-4036-b311-42b2a6369c73\") " pod="openshift-dns/node-resolver-rzb6b" Feb 02 10:32:19 crc kubenswrapper[4845]: I0202 10:32:19.986184 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.021821 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.096038 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rzb6b" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.347511 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.347589 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.347619 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.347643 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.347662 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.347748 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.347802 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:21.347787396 +0000 UTC m=+22.439188846 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.347855 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:32:21.347848737 +0000 UTC m=+22.439250187 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.347944 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.347956 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.347968 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.347990 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:21.347984411 +0000 UTC m=+22.439385861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.348036 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.348056 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:21.348050983 +0000 UTC m=+22.439452433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.348094 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.348104 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.348111 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.348131 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:21.348126275 +0000 UTC m=+22.439527725 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.578225 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-02 10:27:19 +0000 UTC, rotation deadline is 2026-12-07 15:58:08.103272923 +0000 UTC Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.578305 4845 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7397h25m47.524970169s for next certificate rotation Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.682146 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 13:48:43.166575541 +0000 UTC Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.868294 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.868371 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.868388 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0aee212c36ea4834859a48b48df56eef6b94d9bbeee9e5f558f84bfa2387796c"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.869980 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.870472 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.872079 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5" exitCode=255 Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.872155 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.872271 4845 scope.go:117] "RemoveContainer" containerID="74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.873624 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.873702 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"25e9942ce260cd8fefd4586cbb1b3fc34396b829d3f3478a317c741d0d9f8437"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.876136 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rzb6b" event={"ID":"b94e8620-d850-4036-b311-42b2a6369c73","Type":"ContainerStarted","Data":"56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.876181 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rzb6b" event={"ID":"b94e8620-d850-4036-b311-42b2a6369c73","Type":"ContainerStarted","Data":"23fef1dedd5b078b0e6fbaf6ec0c05fc50fc69097e834f39c8e5e754b2f46515"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.877462 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"30aa69b7e93840f6f33b7d4a7200981d9e044a7123cb74614023a21365973090"} Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.895567 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.920051 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.944421 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.962812 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.977132 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.986531 4845 scope.go:117] "RemoveContainer" containerID="5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.986588 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:32:20 crc kubenswrapper[4845]: E0202 10:32:20.986792 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 02 10:32:20 crc kubenswrapper[4845]: I0202 10:32:20.993721 4845 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.007714 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.018983 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.030067 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.043117 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:13Z\\\",\\\"message\\\":\\\"W0202 10:32:02.975508 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0202 10:32:02.975920 1 crypto.go:601] Generating new CA for check-endpoints-signer@1770028322 cert, and key in /tmp/serving-cert-231171005/serving-signer.crt, /tmp/serving-cert-231171005/serving-signer.key\\\\nI0202 10:32:03.164665 1 observer_polling.go:159] Starting file observer\\\\nW0202 10:32:03.171077 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 10:32:03.171332 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:03.176019 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-231171005/tls.crt::/tmp/serving-cert-231171005/tls.key\\\\\\\"\\\\nF0202 10:32:13.433273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 
10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.056812 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.069929 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.081352 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.096600 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.109261 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.356121 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.356193 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.356217 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.356212 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-thbz4"] Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356333 4845 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356353 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356363 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356370 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.356237 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356391 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:32:23.356343942 +0000 UTC m=+24.447745392 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356407 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356455 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356471 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356444 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:23.356433854 +0000 UTC m=+24.447835304 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356590 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:23.356562478 +0000 UTC m=+24.447963938 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.356572 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356651 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356670 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:23.356660011 +0000 UTC m=+24.448061461 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.356690 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:23.356681941 +0000 UTC m=+24.448083391 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.357042 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.357779 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kzwst"] Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.358309 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2wnn9"] Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.358468 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.359016 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.360441 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.361038 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.361161 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.362871 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.364424 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.364477 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.364552 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.364556 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.364941 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.365717 4845 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.366237 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.371435 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.387359 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:13Z\\\",\\\"message\\\":\\\"W0202 10:32:02.975508 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0202 10:32:02.975920 1 crypto.go:601] Generating new CA for check-endpoints-signer@1770028322 cert, and key in /tmp/serving-cert-231171005/serving-signer.crt, /tmp/serving-cert-231171005/serving-signer.key\\\\nI0202 10:32:03.164665 1 observer_polling.go:159] Starting file observer\\\\nW0202 10:32:03.171077 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 10:32:03.171332 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:03.176019 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-231171005/tls.crt::/tmp/serving-cert-231171005/tls.key\\\\\\\"\\\\nF0202 10:32:13.433273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.408876 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.440847 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457148 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457206 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-cni-bin\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457228 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-system-cni-dir\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457335 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebf2f253-531f-4835-84c1-928680352f7f-proxy-tls\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457441 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-cni-binary-copy\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457469 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-cni-multus\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457507 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-multus-certs\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457537 
4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-kubelet\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457594 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-conf-dir\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457643 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01f1334f-a21c-4487-a1d6-dbecf7017c59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457686 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-cni-dir\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457709 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-daemon-config\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457764 
4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebf2f253-531f-4835-84c1-928680352f7f-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457792 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-cnibin\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457814 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ebf2f253-531f-4835-84c1-928680352f7f-rootfs\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457834 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9h9c\" (UniqueName: \"kubernetes.io/projected/01f1334f-a21c-4487-a1d6-dbecf7017c59-kube-api-access-k9h9c\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457896 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-k8s-cni-cncf-io\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " 
pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457928 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01f1334f-a21c-4487-a1d6-dbecf7017c59-cni-binary-copy\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.457963 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-socket-dir-parent\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458012 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-hostroot\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458028 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-etc-kubernetes\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458058 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-cnibin\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " 
pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458085 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-os-release\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458103 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8c6\" (UniqueName: \"kubernetes.io/projected/ebf2f253-531f-4835-84c1-928680352f7f-kube-api-access-jc8c6\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458120 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-os-release\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458148 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-system-cni-dir\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458167 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-netns\") pod \"multus-kzwst\" (UID: 
\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.458183 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ptl6\" (UniqueName: \"kubernetes.io/projected/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-kube-api-access-4ptl6\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.459981 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.473403 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.487181 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.501261 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.514087 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.524785 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.535999 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.550332 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.559741 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-thbz4\" (UID: 
\"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560089 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-system-cni-dir\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560199 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-cni-bin\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560324 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-cni-binary-copy\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560444 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-cni-multus\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560557 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-multus-certs\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " 
pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560693 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebf2f253-531f-4835-84c1-928680352f7f-proxy-tls\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560831 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-kubelet\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560972 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-conf-dir\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561100 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01f1334f-a21c-4487-a1d6-dbecf7017c59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561217 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-cni-dir\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc 
kubenswrapper[4845]: I0202 10:32:21.561320 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-daemon-config\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561437 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-cnibin\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561554 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ebf2f253-531f-4835-84c1-928680352f7f-rootfs\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561694 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebf2f253-531f-4835-84c1-928680352f7f-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561817 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-k8s-cni-cncf-io\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561970 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01f1334f-a21c-4487-a1d6-dbecf7017c59-cni-binary-copy\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562110 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9h9c\" (UniqueName: \"kubernetes.io/projected/01f1334f-a21c-4487-a1d6-dbecf7017c59-kube-api-access-k9h9c\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562745 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-hostroot\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562791 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-etc-kubernetes\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562823 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-socket-dir-parent\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562843 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-cnibin\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562842 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-hostroot\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562878 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-os-release\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561635 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ebf2f253-531f-4835-84c1-928680352f7f-rootfs\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562913 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8c6\" (UniqueName: \"kubernetes.io/projected/ebf2f253-531f-4835-84c1-928680352f7f-kube-api-access-jc8c6\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561019 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-conf-dir\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562934 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-os-release\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562956 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-netns\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562973 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ptl6\" (UniqueName: \"kubernetes.io/projected/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-kube-api-access-4ptl6\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562969 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-cnibin\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561438 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-cni-dir\") pod \"multus-kzwst\" (UID: 
\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.562995 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-system-cni-dir\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.563025 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-etc-kubernetes\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560480 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-cni-multus\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560965 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-kubelet\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.563097 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-socket-dir-parent\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561483 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-cnibin\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560616 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-multus-certs\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560576 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.563355 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-os-release\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.561903 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-k8s-cni-cncf-io\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.563397 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-run-netns\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560240 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-host-var-lib-cni-bin\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.563438 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-os-release\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.560206 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/01f1334f-a21c-4487-a1d6-dbecf7017c59-system-cni-dir\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.563508 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-system-cni-dir\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.565146 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.586241 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.598690 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.613328 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.623679 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.637062 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-cni-binary-copy\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.637072 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-multus-daemon-config\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.637755 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ebf2f253-531f-4835-84c1-928680352f7f-mcd-auth-proxy-config\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.638279 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/01f1334f-a21c-4487-a1d6-dbecf7017c59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " 
pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.638311 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/01f1334f-a21c-4487-a1d6-dbecf7017c59-cni-binary-copy\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.640238 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ebf2f253-531f-4835-84c1-928680352f7f-proxy-tls\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.642654 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9h9c\" (UniqueName: \"kubernetes.io/projected/01f1334f-a21c-4487-a1d6-dbecf7017c59-kube-api-access-k9h9c\") pod \"multus-additional-cni-plugins-thbz4\" (UID: \"01f1334f-a21c-4487-a1d6-dbecf7017c59\") " pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.643024 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:13Z\\\",\\\"message\\\":\\\"W0202 10:32:02.975508 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0202 10:32:02.975920 1 crypto.go:601] Generating new CA for check-endpoints-signer@1770028322 cert, and key in /tmp/serving-cert-231171005/serving-signer.crt, /tmp/serving-cert-231171005/serving-signer.key\\\\nI0202 10:32:03.164665 1 observer_polling.go:159] Starting file observer\\\\nW0202 10:32:03.171077 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 10:32:03.171332 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:03.176019 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-231171005/tls.crt::/tmp/serving-cert-231171005/tls.key\\\\\\\"\\\\nF0202 10:32:13.433273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 
10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.643321 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8c6\" (UniqueName: \"kubernetes.io/projected/ebf2f253-531f-4835-84c1-928680352f7f-kube-api-access-jc8c6\") pod \"machine-config-daemon-2wnn9\" (UID: \"ebf2f253-531f-4835-84c1-928680352f7f\") " pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.644572 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ptl6\" (UniqueName: \"kubernetes.io/projected/310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3-kube-api-access-4ptl6\") pod \"multus-kzwst\" (UID: \"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\") " pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.660615 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.671313 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-thbz4" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.682093 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.682126 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kzwst" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.683252 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 06:19:36.737907839 +0000 UTC Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.686419 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:32:21 crc kubenswrapper[4845]: W0202 10:32:21.697092 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod310f06ec_b9c5_40c9_aeb9_a6e4ef5304c3.slice/crio-59b1c33b87539606b5c541081e96d98c017ec607104265a3ef4908039e16e5ed WatchSource:0}: Error finding container 59b1c33b87539606b5c541081e96d98c017ec607104265a3ef4908039e16e5ed: Status 404 returned error can't find the container with id 59b1c33b87539606b5c541081e96d98c017ec607104265a3ef4908039e16e5ed Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.700248 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: W0202 10:32:21.705600 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf2f253_531f_4835_84c1_928680352f7f.slice/crio-bc4a27138c54b706fb0fc9998b70b52b659090ec0b2c3097a533a45b406bbb7b WatchSource:0}: Error finding container bc4a27138c54b706fb0fc9998b70b52b659090ec0b2c3097a533a45b406bbb7b: Status 404 returned error can't find the container with id bc4a27138c54b706fb0fc9998b70b52b659090ec0b2c3097a533a45b406bbb7b Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.711900 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.712063 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.712527 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.712610 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.712687 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.712756 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.720940 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.721670 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.723763 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.724740 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.726321 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.728180 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.729037 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.730346 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.731281 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.732561 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.733293 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.734311 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.735689 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.736785 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.741574 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.742402 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.744549 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.745173 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.746061 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.748623 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.749386 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.751281 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.751981 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.753260 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.753704 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.757013 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.757687 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.758183 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.759964 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.760504 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.762809 4845 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.762935 4845 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.764762 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.765787 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.766292 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.767777 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.768504 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.769584 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.770276 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.771426 4845 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.771943 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.775569 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.776706 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.777348 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.777827 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.779138 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.781146 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.781929 4845 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.782413 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.783312 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.783769 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.784940 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.785703 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.786194 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.787708 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sh5vd"] Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.788603 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.791945 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.792245 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.792374 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.792545 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.802382 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.806351 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.806651 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.826403 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.840676 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.853131 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.866982 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.867335 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovn-node-metrics-cert\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.867449 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-netns\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.867552 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.867658 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-bin\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.867765 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-systemd-units\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.867972 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-systemd\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868089 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-node-log\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868183 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-config\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868275 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-kubelet\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868427 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-slash\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868523 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdgfw\" (UniqueName: \"kubernetes.io/projected/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-kube-api-access-wdgfw\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868622 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-var-lib-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868704 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-log-socket\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868785 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.868878 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-env-overrides\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.869023 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-ovn\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.869129 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-script-lib\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.869221 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-etc-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.869311 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-netd\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.884839 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kzwst" event={"ID":"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3","Type":"ContainerStarted","Data":"59b1c33b87539606b5c541081e96d98c017ec607104265a3ef4908039e16e5ed"} Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.885975 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" event={"ID":"01f1334f-a21c-4487-a1d6-dbecf7017c59","Type":"ContainerStarted","Data":"7026b81ced6a434941c75f5240aa0c7022340daff2657863bf36bd03319f3ebf"} Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.887616 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.889827 4845 scope.go:117] "RemoveContainer" containerID="5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5" Feb 02 10:32:21 crc kubenswrapper[4845]: E0202 10:32:21.890117 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.894940 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"bc4a27138c54b706fb0fc9998b70b52b659090ec0b2c3097a533a45b406bbb7b"} Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.896285 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.916081 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.929849 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74689dffec06ff54020fec1e21fdf780832e7ee03a7fed6c12395b985502870b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:13Z\\\",\\\"message\\\":\\\"W0202 10:32:02.975508 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0202 10:32:02.975920 1 crypto.go:601] Generating new CA for check-endpoints-signer@1770028322 cert, and key in /tmp/serving-cert-231171005/serving-signer.crt, /tmp/serving-cert-231171005/serving-signer.key\\\\nI0202 10:32:03.164665 1 observer_polling.go:159] Starting file observer\\\\nW0202 10:32:03.171077 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 10:32:03.171332 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:03.176019 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-231171005/tls.crt::/tmp/serving-cert-231171005/tls.key\\\\\\\"\\\\nF0202 10:32:13.433273 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.943642 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.963874 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970718 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovn-node-metrics-cert\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970764 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-netns\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970782 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-bin\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970800 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970817 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-systemd-units\") pod \"ovnkube-node-sh5vd\" (UID: 
\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970829 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-systemd\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970845 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-config\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970860 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-kubelet\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970862 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-netns\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970878 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-node-log\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 
crc kubenswrapper[4845]: I0202 10:32:21.970912 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-slash\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970937 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdgfw\" (UniqueName: \"kubernetes.io/projected/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-kube-api-access-wdgfw\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970960 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-var-lib-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.970981 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-log-socket\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971002 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-env-overrides\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971027 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971043 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-ovn\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971057 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-script-lib\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971072 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-etc-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971084 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-netd\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971106 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971163 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971197 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-bin\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971217 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971236 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-systemd-units\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971256 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-systemd\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971650 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-log-socket\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971731 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-kubelet\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971774 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-node-log\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971801 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-slash\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.971849 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-config\") pod \"ovnkube-node-sh5vd\" (UID: 
\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.972116 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-var-lib-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.972238 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-env-overrides\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.972281 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-etc-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.972308 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-netd\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.972332 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-openvswitch\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 
crc kubenswrapper[4845]: I0202 10:32:21.972354 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-ovn\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.972718 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-script-lib\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.975448 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovn-node-metrics-cert\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.977847 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.987833 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdgfw\" (UniqueName: \"kubernetes.io/projected/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-kube-api-access-wdgfw\") pod \"ovnkube-node-sh5vd\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.988173 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:21 crc kubenswrapper[4845]: I0202 10:32:21.999721 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:21Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.011963 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.023043 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.034188 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.048053 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.059651 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.074292 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.093943 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.115739 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.122415 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.134918 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: W0202 10:32:22.144634 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b93b041_3f3f_47ba_a9d4_d09de1b326dc.slice/crio-000add16087c8ea520a073a5f979801fd8148c67d21762dc28e5704d78c31411 WatchSource:0}: Error finding container 000add16087c8ea520a073a5f979801fd8148c67d21762dc28e5704d78c31411: Status 404 returned error can't find the container with id 000add16087c8ea520a073a5f979801fd8148c67d21762dc28e5704d78c31411 Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.158992 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.189393 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.203179 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.225360 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.684049 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:40:47.321187711 +0000 UTC Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.904398 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b"} Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.906558 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690"} Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.906622 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428"} Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.908128 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kzwst" event={"ID":"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3","Type":"ContainerStarted","Data":"2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a"} Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.910360 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f1334f-a21c-4487-a1d6-dbecf7017c59" containerID="86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c" exitCode=0 Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.910410 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" event={"ID":"01f1334f-a21c-4487-a1d6-dbecf7017c59","Type":"ContainerDied","Data":"86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c"} Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.912614 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1" exitCode=0 Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.912657 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1"} Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.912685 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" 
event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"000add16087c8ea520a073a5f979801fd8148c67d21762dc28e5704d78c31411"} Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.933463 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.950632 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.969897 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4845]: I0202 10:32:22.992153 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.031051 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.057398 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.070846 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.087153 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.098526 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.112635 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.130704 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f64
45c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.143557 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.161291 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.175104 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.185352 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.198694 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.214598 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.232229 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.244767 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.269052 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.283502 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://077
53d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.296549 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.313266 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.328295 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.390346 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.390429 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.390454 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.390477 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.390505 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.390607 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.390655 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:27.390641907 +0000 UTC m=+28.482043357 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.390986 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:32:27.390978337 +0000 UTC m=+28.482379787 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391021 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391043 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:27.391037228 +0000 UTC m=+28.482438678 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391096 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391106 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391115 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391135 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:27.391129801 +0000 UTC m=+28.482531251 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391170 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391178 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391185 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.391223 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:27.391216224 +0000 UTC m=+28.482617674 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.518483 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.534158 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.537176 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.541237 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.551604 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.565672 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.576400 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.585680 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.597369 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.607950 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f715
0fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02
T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.622427 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.634845 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.651489 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.665629 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.682349 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.684839 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 09:00:59.631176905 +0000 UTC Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.699090 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-xdtrh"] Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.699442 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.702510 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.702676 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.703338 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.703358 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.703699 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.711839 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.711857 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.711849 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.712006 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.712117 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:23 crc kubenswrapper[4845]: E0202 10:32:23.712180 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.713635 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.727567 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.738705 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedf
dab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.761902 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.775464 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.790011 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.793794 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-host\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.793987 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f24s\" (UniqueName: \"kubernetes.io/projected/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-kube-api-access-9f24s\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.794174 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-serviceca\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.806228 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.819054 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.833057 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.849870 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.893136 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.894566 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-host\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " 
pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.894685 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f24s\" (UniqueName: \"kubernetes.io/projected/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-kube-api-access-9f24s\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.894681 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-host\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.894803 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-serviceca\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.895689 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-serviceca\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.920751 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f1334f-a21c-4487-a1d6-dbecf7017c59" containerID="94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f" exitCode=0 Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.920848 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" 
event={"ID":"01f1334f-a21c-4487-a1d6-dbecf7017c59","Type":"ContainerDied","Data":"94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f"} Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.926326 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"} Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.926356 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"} Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.926369 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"} Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.926380 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"} Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.946811 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f24s\" (UniqueName: \"kubernetes.io/projected/86c7aea7-01dc-4f6d-ab41-94447a76fd6e-kube-api-access-9f24s\") pod \"node-ca-xdtrh\" (UID: \"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\") " pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.954338 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:23 crc kubenswrapper[4845]: I0202 10:32:23.995405 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:23Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.037261 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.072611 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.089442 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xdtrh" Feb 02 10:32:24 crc kubenswrapper[4845]: W0202 10:32:24.101279 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86c7aea7_01dc_4f6d_ab41_94447a76fd6e.slice/crio-f7b5c48677ea9e06fac08d757561709c77b36cab742fc1a2c9b0109414152ea9 WatchSource:0}: Error finding container f7b5c48677ea9e06fac08d757561709c77b36cab742fc1a2c9b0109414152ea9: Status 404 returned error can't find the container with id f7b5c48677ea9e06fac08d757561709c77b36cab742fc1a2c9b0109414152ea9 Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.112088 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.151155 4845 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.193108 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.230235 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.270719 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.308122 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.350107 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.389077 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.428851 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.470953 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.513612 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.685021 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 12:00:45.327706892 +0000 UTC Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.933823 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"} Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.933882 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"} Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.935437 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xdtrh" event={"ID":"86c7aea7-01dc-4f6d-ab41-94447a76fd6e","Type":"ContainerStarted","Data":"520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af"} Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 
10:32:24.935497 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xdtrh" event={"ID":"86c7aea7-01dc-4f6d-ab41-94447a76fd6e","Type":"ContainerStarted","Data":"f7b5c48677ea9e06fac08d757561709c77b36cab742fc1a2c9b0109414152ea9"} Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.937923 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f1334f-a21c-4487-a1d6-dbecf7017c59" containerID="f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42" exitCode=0 Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.938000 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" event={"ID":"01f1334f-a21c-4487-a1d6-dbecf7017c59","Type":"ContainerDied","Data":"f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42"} Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.952089 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.968312 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T1
0:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:24 crc kubenswrapper[4845]: I0202 10:32:24.984011 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.001252 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:24Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.014612 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.034864 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f429
28e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092
272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.046760 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.058655 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.074854 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.086729 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.099060 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.113574 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.126278 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.138898 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.150242 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.160752 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.189601 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f129
62a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.227660 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.285471 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.315547 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.351627 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.402450 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.444599 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.477313 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.512470 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.554435 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.597815 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.637609 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.685616 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 12:31:30.973017089 +0000 UTC Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.712322 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.712355 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.712484 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:25 crc kubenswrapper[4845]: E0202 10:32:25.712584 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:25 crc kubenswrapper[4845]: E0202 10:32:25.712754 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:25 crc kubenswrapper[4845]: E0202 10:32:25.712919 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.925238 4845 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.928394 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.928451 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.928464 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.928591 4845 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.937804 4845 kubelet_node_status.go:115] "Node was 
previously registered" node="crc" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.938209 4845 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.939623 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.939704 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.939752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.939805 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.939827 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.944588 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" event={"ID":"01f1334f-a21c-4487-a1d6-dbecf7017c59","Type":"ContainerDied","Data":"57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0"} Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.944506 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f1334f-a21c-4487-a1d6-dbecf7017c59" containerID="57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0" exitCode=0 Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.968904 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: E0202 10:32:25.969129 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5
987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.973426 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.973482 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.973501 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.973525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.973543 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4845]: I0202 10:32:25.984533 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4845]: E0202 10:32:25.991952 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.001107 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.003436 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.003537 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.003557 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.003581 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.003611 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.013056 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: E0202 10:32:26.019308 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.023675 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.023712 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.023721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.023735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.023745 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.028997 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:
32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: E0202 10:32:26.034816 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.038409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.038450 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.038468 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.038491 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.038504 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.047679 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: E0202 10:32:26.050651 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: E0202 10:32:26.050758 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.056986 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.057018 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.057027 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.057040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.057049 4845 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.067128 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e
81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.081618 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://077
53d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.096138 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.162304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.162334 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.162343 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.162355 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.162365 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.162829 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.175543 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.187173 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.202713 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.230386 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.264210 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.264247 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.264258 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.264274 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.264285 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.366857 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.366931 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.366945 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.366962 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.366974 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.469500 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.469542 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.469554 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.469568 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.469581 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.572807 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.572917 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.573071 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.573104 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.573126 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.604005 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.608785 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.617307 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.619501 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.630662 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.645862 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.667795 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.675969 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc 
kubenswrapper[4845]: I0202 10:32:26.676018 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.676031 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.676056 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.676070 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.680526 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.686349 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 03:12:55.948069757 +0000 UTC Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.690209 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.699649 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T1
0:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.708575 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.719996 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.752110 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.771733 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.778606 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.778654 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.778664 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.778679 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.778689 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.788130 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.800173 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.812737 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.852165 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.880834 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.880871 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.880954 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.880987 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.881000 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.891762 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z 
is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.932950 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.952078 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f1334f-a21c-4487-a1d6-dbecf7017c59" containerID="7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6" exitCode=0 Feb 02 10:32:26 crc kubenswrapper[4845]: 
I0202 10:32:26.952163 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" event={"ID":"01f1334f-a21c-4487-a1d6-dbecf7017c59","Type":"ContainerDied","Data":"7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.959968 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"} Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.972706 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a6952
0ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.987120 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.987161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.987170 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.987186 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4845]: I0202 10:32:26.987196 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4845]: E0202 10:32:26.988054 4845 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.029423 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.072219 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://077
53d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.090191 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.090234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.090250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.090271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.090286 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.111387 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb02013248
7234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.155061 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.194006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.194057 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.194070 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.194088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.194100 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.195129 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.233997 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.271337 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.297191 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.297224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.297236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.297254 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.297268 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.314500 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.349794 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.396584 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.400956 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.401226 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.401388 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.401691 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.401874 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.434207 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.471444 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.474972 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.475093 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.475133 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.475164 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.475189 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475265 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475322 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:35.475304226 +0000 UTC m=+36.566705696 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475663 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:32:35.475649086 +0000 UTC m=+36.567050546 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475754 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475777 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475792 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475825 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:35.475816131 +0000 UTC m=+36.567217591 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475879 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475913 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475923 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.475950 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-02 10:32:35.475941914 +0000 UTC m=+36.567343384 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.476000 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.476028 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:35.476020646 +0000 UTC m=+36.567422106 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.504521 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.504551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.504561 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.504576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.504587 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.515031 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.550619 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.592705 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.607241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.607310 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.607333 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc 
kubenswrapper[4845]: I0202 10:32:27.607358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.607377 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.631506 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.676377 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.687496 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 02:41:19.240909339 +0000 UTC Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.710442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.710505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.710525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.710552 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.710569 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.711977 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.712060 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.712173 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.712163 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.712340 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:27 crc kubenswrapper[4845]: E0202 10:32:27.712531 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.716293 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.753166 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9f
b5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.815037 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.815081 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.815100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.815117 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.815130 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.815337 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.840412 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.877286 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.911706 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.917397 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.917432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.917444 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.917460 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.917472 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.951749 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.971482 4845 generic.go:334] "Generic (PLEG): container finished" podID="01f1334f-a21c-4487-a1d6-dbecf7017c59" containerID="0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233" exitCode=0 Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.971552 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" 
event={"ID":"01f1334f-a21c-4487-a1d6-dbecf7017c59","Type":"ContainerDied","Data":"0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233"} Feb 02 10:32:27 crc kubenswrapper[4845]: I0202 10:32:27.989794 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acce
ss-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:27Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.019970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.020005 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.020014 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.020026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.020035 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.029513 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.072059 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.114835 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.122110 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.122139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.122149 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.122164 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.122176 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.159450 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.195010 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.225176 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.225210 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.225225 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.225247 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.225263 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.236173 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.276744 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.311439 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.327415 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.327447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.327457 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.327471 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.327481 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.351664 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.392785 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.430660 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.430724 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.430735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.430761 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.430803 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.432218 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133f
d1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.472429 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.514300 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedf
dab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.532855 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.532945 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.532959 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.532979 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.532992 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.555060 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.593835 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.629595 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.637031 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.637083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.637102 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.637125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.637145 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.687759 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 15:35:41.812495383 +0000 UTC Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.739663 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.739701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.739719 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.739742 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.739760 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.810729 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.811770 4845 scope.go:117] "RemoveContainer" containerID="5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5" Feb 02 10:32:28 crc kubenswrapper[4845]: E0202 10:32:28.812050 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.843696 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.843747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.843761 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.843784 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.843798 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.946838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.946913 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.946927 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.946956 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.946972 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.980415 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf"} Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.980748 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.980861 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:28 crc kubenswrapper[4845]: I0202 10:32:28.986818 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" event={"ID":"01f1334f-a21c-4487-a1d6-dbecf7017c59","Type":"ContainerStarted","Data":"a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.007245 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.017145 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.017213 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.025271 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://077
53d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.038965 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.053863 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.053956 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.053974 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.053999 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.054019 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.070858 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.091414 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.105930 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.119869 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.133611 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.147542 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.157542 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.157580 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.157591 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.157606 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.157618 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.164234 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.177757 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.189357 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.206607 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.217609 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f715
0fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02
T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.226146 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.260420 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.260803 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.260898 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.261006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.261124 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.273509 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.310585 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.355152 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.368264 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.368305 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.368315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.368333 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.368345 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.387324 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.429434 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.453243 4845 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.470245 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.470310 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.470330 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.470362 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.470382 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.472656 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.513176 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.551257 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.573189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.573240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.573252 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.573268 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.573279 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.600263 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.639807 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.676492 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.676572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.676595 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.676624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.676643 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.688678 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 23:22:51.184455895 +0000 UTC Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.711620 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.711645 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.711802 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:29 crc kubenswrapper[4845]: E0202 10:32:29.711790 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:29 crc kubenswrapper[4845]: E0202 10:32:29.712023 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:29 crc kubenswrapper[4845]: E0202 10:32:29.712211 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.777502 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.781654 4845 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.781703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.781715 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.781731 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.781742 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.806047 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.823024 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.841099 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.873267 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.887347 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.887385 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.887396 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.887409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.887420 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.898431 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.919468 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.954752 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.990109 4845 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.990151 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.990163 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.990180 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.990220 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4845]: I0202 10:32:29.991452 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.012089 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.036302 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.078567 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.093339 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.093411 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.093429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.093455 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.093483 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.117176 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.156093 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.196826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.196920 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.196935 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.196957 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.196970 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.197108 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.237473 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.270535 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.300137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.300193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.300205 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.300224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.300236 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.308816 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.356617 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.392417 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.403530 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.403582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.403600 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.403626 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.403645 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.428437 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.506435 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.506496 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.506516 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.506539 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.506555 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.613342 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.613397 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.613415 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.613444 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.613461 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.689289 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 12:05:58.974251374 +0000 UTC Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.716591 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.716636 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.716652 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.716675 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.716692 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.819318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.819360 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.819372 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.819530 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.819541 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.922787 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.922828 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.922839 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.922855 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4845]: I0202 10:32:30.922866 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.015017 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.025767 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.025802 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.025815 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.025832 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.025845 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.128827 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.128876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.128904 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.128922 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.128933 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.231612 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.231678 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.231688 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.231702 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.231712 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.335071 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.335126 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.335142 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.335163 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.335174 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.438380 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.438435 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.438451 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.438472 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.438486 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.541367 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.541825 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.541839 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.541858 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.541896 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.645034 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.645096 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.645105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.645119 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.645143 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.690387 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 09:31:16.791410722 +0000 UTC Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.712262 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.712271 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.712456 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:31 crc kubenswrapper[4845]: E0202 10:32:31.712631 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:31 crc kubenswrapper[4845]: E0202 10:32:31.712769 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:31 crc kubenswrapper[4845]: E0202 10:32:31.712929 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.748394 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.748436 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.748445 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.748459 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.748471 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.851418 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.851487 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.851498 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.851540 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.851556 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.954463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.954524 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.954547 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.954571 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4845]: I0202 10:32:31.954589 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.022420 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/0.log" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.027400 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf" exitCode=1 Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.027453 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.028527 4845 scope.go:117] "RemoveContainer" containerID="81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.053034 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.057164 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.057345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.057385 4845 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.057550 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.057590 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.076406 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.099701 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.118410 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.135154 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: 
I0202 10:32:32.153789 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.160143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.160171 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.160183 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.160198 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.160207 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.172398 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.184645 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.198219 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.219145 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:31Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:31.598437 6245 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:31.598495 6245 handler.go:190] Sending 
*v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:31.598528 6245 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:32:31.598541 6245 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:31.598581 6245 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:31.598586 6245 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:31.598588 6245 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:32:31.598604 6245 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:31.598627 6245 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:31.598660 6245 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:31.598674 6245 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:31.598726 6245 factory.go:656] Stopping watch factory\\\\nI0202 10:32:31.598752 6245 ovnkube.go:599] Stopped ovnkube\\\\nI0202 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.245217 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.258019 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.261955 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.261988 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.261997 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.262011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.262020 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.269720 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.284933 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.297085 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.366286 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.366373 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.366387 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.366432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.366444 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.469216 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.469257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.469268 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.469283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.469293 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.571931 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.571971 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.571983 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.571999 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.572010 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.674248 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.674304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.674317 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.674331 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.674340 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.691505 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 04:30:24.33298604 +0000 UTC Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.776994 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.777027 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.777035 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.777061 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.777071 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.879497 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.879531 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.879556 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.879570 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.879579 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.982164 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.982194 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.982203 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.982216 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4845]: I0202 10:32:32.982224 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.032092 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/1.log" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.032736 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/0.log" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.035729 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d" exitCode=1 Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.035789 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.035846 4845 scope.go:117] "RemoveContainer" containerID="81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.036742 4845 scope.go:117] "RemoveContainer" containerID="85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d" Feb 02 10:32:33 crc kubenswrapper[4845]: E0202 10:32:33.036975 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.058614 4845 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02
-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a
00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.074868 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.085009 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.085052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.085069 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.085091 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.085107 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.089509 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.101109 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.115764 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T1
0:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.125968 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.142218 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.157283 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk"] Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.157795 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.159865 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.159942 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.159949 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.180171 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.188172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.188311 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.188326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.188341 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.188356 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.197492 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.212581 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.237498 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:31Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:31.598437 6245 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:31.598495 6245 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:31.598528 6245 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0202 10:32:31.598541 6245 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:31.598581 6245 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:31.598586 6245 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:31.598588 6245 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:32:31.598604 6245 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:31.598627 6245 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:31.598660 6245 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:31.598674 6245 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:31.598726 6245 factory.go:656] Stopping watch factory\\\\nI0202 10:32:31.598752 6245 ovnkube.go:599] Stopped ovnkube\\\\nI0202 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: 
[failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.250635 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.263932 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.277337 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.291540 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.291593 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.291606 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.291622 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.291634 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.291731 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.305213 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.320639 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.335379 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67f40dda-fb6a-490a-86a1-a14d6b183c8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.335464 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdnv\" (UniqueName: \"kubernetes.io/projected/67f40dda-fb6a-490a-86a1-a14d6b183c8b-kube-api-access-zvdnv\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.335500 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67f40dda-fb6a-490a-86a1-a14d6b183c8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.335517 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67f40dda-fb6a-490a-86a1-a14d6b183c8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: 
\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.347165 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.357492 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node
-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.370021 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.383155 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedf
dab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.394371 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.394428 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.394443 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.394462 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.394475 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.395847 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.410556 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.434297 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d
96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.437491 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdnv\" (UniqueName: \"kubernetes.io/projected/67f40dda-fb6a-490a-86a1-a14d6b183c8b-kube-api-access-zvdnv\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.437783 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67f40dda-fb6a-490a-86a1-a14d6b183c8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 
10:32:33.437817 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67f40dda-fb6a-490a-86a1-a14d6b183c8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.437858 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67f40dda-fb6a-490a-86a1-a14d6b183c8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.438487 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/67f40dda-fb6a-490a-86a1-a14d6b183c8b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.438980 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/67f40dda-fb6a-490a-86a1-a14d6b183c8b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.447167 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.452821 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/67f40dda-fb6a-490a-86a1-a14d6b183c8b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.458282 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdnv\" (UniqueName: \"kubernetes.io/projected/67f40dda-fb6a-490a-86a1-a14d6b183c8b-kube-api-access-zvdnv\") pod 
\"ovnkube-control-plane-749d76644c-c2rmk\" (UID: \"67f40dda-fb6a-490a-86a1-a14d6b183c8b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.462597 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.475559 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.482157 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:31Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:31.598437 6245 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:31.598495 6245 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:31.598528 6245 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0202 10:32:31.598541 6245 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:31.598581 6245 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:31.598586 6245 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:31.598588 6245 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:32:31.598604 6245 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:31.598627 6245 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:31.598660 6245 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:31.598674 6245 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:31.598726 6245 factory.go:656] Stopping watch factory\\\\nI0202 10:32:31.598752 6245 ovnkube.go:599] Stopped ovnkube\\\\nI0202 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: 
[failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: W0202 10:32:33.490983 4845 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f40dda_fb6a_490a_86a1_a14d6b183c8b.slice/crio-f0afaa9dd829ee2f0aa60a57b5c121c06decee396a0f4903ec001af7378c525f WatchSource:0}: Error finding container f0afaa9dd829ee2f0aa60a57b5c121c06decee396a0f4903ec001af7378c525f: Status 404 returned error can't find the container with id f0afaa9dd829ee2f0aa60a57b5c121c06decee396a0f4903ec001af7378c525f Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.496557 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.496603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.496619 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.496642 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.496659 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.499987 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5
225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.525770 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.543589 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.598919 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.598970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.598981 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.598998 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.599011 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.692183 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 19:17:00.055219903 +0000 UTC Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.701425 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.701459 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.701471 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.701484 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.701496 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.712087 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:33 crc kubenswrapper[4845]: E0202 10:32:33.712255 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.712492 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.712597 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:33 crc kubenswrapper[4845]: E0202 10:32:33.712673 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:33 crc kubenswrapper[4845]: E0202 10:32:33.712622 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.805144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.805218 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.805241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.805265 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.805284 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.909059 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.909124 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.909142 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.909165 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4845]: I0202 10:32:33.909182 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.013023 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.013080 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.013098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.013123 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.013140 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.043342 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/1.log" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.049097 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" event={"ID":"67f40dda-fb6a-490a-86a1-a14d6b183c8b","Type":"ContainerStarted","Data":"f0afaa9dd829ee2f0aa60a57b5c121c06decee396a0f4903ec001af7378c525f"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.123575 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.123628 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.123641 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.123657 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.123670 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.227015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.227368 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.227381 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.227399 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.227410 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.330817 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.330872 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.330940 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.330968 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.330984 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.433587 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.433646 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.433658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.433676 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.433688 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.535773 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.535813 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.535829 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.535849 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.535866 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.638700 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pmn9h"] Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.639581 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:34 crc kubenswrapper[4845]: E0202 10:32:34.639689 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.639752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.639802 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.639824 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.639851 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.639874 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.664139 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z 
is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.681774 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.692779 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 01:55:51.802521599 +0000 UTC Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 
10:32:34.696565 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.712189 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.733748 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.742701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.742755 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.742775 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.742799 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.742816 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.749426 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.757795 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.757866 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5d9c\" (UniqueName: \"kubernetes.io/projected/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-kube-api-access-v5d9c\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.766163 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.789031 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:31Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:31.598437 6245 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:31.598495 6245 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:31.598528 6245 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0202 10:32:31.598541 6245 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:31.598581 6245 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:31.598586 6245 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:31.598588 6245 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:32:31.598604 6245 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:31.598627 6245 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:31.598660 6245 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:31.598674 6245 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:31.598726 6245 factory.go:656] Stopping watch factory\\\\nI0202 10:32:31.598752 6245 ovnkube.go:599] Stopped ovnkube\\\\nI0202 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: 
[failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.817053 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0
,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.832949 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.845939 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.846013 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.846036 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.846065 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.846094 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.850939 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.858945 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5d9c\" (UniqueName: \"kubernetes.io/projected/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-kube-api-access-v5d9c\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.859236 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:34 crc kubenswrapper[4845]: E0202 10:32:34.859445 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:34 crc kubenswrapper[4845]: E0202 
10:32:34.859566 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs podName:84cb7b66-62e7-4012-ab80-7c5e6ba51e35 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:35.359535398 +0000 UTC m=+36.450936878 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs") pod "network-metrics-daemon-pmn9h" (UID: "84cb7b66-62e7-4012-ab80-7c5e6ba51e35") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.867246 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.879351 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5d9c\" (UniqueName: 
\"kubernetes.io/projected/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-kube-api-access-v5d9c\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.884819 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.900826 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.914432 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.931753 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.942661 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc 
kubenswrapper[4845]: I0202 10:32:34.948242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.948451 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.948555 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.948671 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4845]: I0202 10:32:34.948835 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.051362 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.051401 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.051434 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.051455 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.051468 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.055180 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" event={"ID":"67f40dda-fb6a-490a-86a1-a14d6b183c8b","Type":"ContainerStarted","Data":"9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.055230 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" event={"ID":"67f40dda-fb6a-490a-86a1-a14d6b183c8b","Type":"ContainerStarted","Data":"286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.069686 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690
\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.081526 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.092076 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924a
e188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.106056 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.117875 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.133794 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.153801 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.153855 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.153869 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.153913 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.153930 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.155337 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:31Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:31.598437 6245 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:31.598495 6245 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:31.598528 6245 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0202 10:32:31.598541 6245 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:31.598581 6245 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:31.598586 6245 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:31.598588 6245 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:32:31.598604 6245 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:31.598627 6245 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:31.598660 6245 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:31.598674 6245 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:31.598726 6245 factory.go:656] Stopping watch factory\\\\nI0202 10:32:31.598752 6245 ovnkube.go:599] Stopped ovnkube\\\\nI0202 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: 
[failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.182246 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0
,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.201683 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.229835 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.247186 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.256092 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.256124 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.256135 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.256152 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.256162 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.260451 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.271726 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.283426 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.296481 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.305941 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc 
kubenswrapper[4845]: I0202 10:32:35.316428 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.358673 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.358708 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.358718 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.358732 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.358741 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.364235 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.364355 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.364411 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs podName:84cb7b66-62e7-4012-ab80-7c5e6ba51e35 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:36.364398532 +0000 UTC m=+37.455799982 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs") pod "network-metrics-daemon-pmn9h" (UID: "84cb7b66-62e7-4012-ab80-7c5e6ba51e35") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.460635 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.460715 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.460737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.460767 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.460788 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.564399 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.564465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.564483 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.564508 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.564525 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.566096 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.566263 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566286 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:32:51.566261724 +0000 UTC m=+52.657663294 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.566313 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.566362 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.566417 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566430 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:35 crc 
kubenswrapper[4845]: E0202 10:32:35.566464 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566481 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:51.56647136 +0000 UTC m=+52.657872990 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566541 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:51.566526622 +0000 UTC m=+52.657928252 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566638 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566667 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566700 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566715 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566772 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:51.566750668 +0000 UTC m=+52.658152118 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566672 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566800 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.566856 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:51.566837411 +0000 UTC m=+52.658238861 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.668430 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.668486 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.668499 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.668518 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.668533 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.693750 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 22:14:57.844607176 +0000 UTC Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.712091 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.712161 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.712228 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.712244 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.712367 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:35 crc kubenswrapper[4845]: E0202 10:32:35.712585 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.771196 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.771242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.771251 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.771266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.771275 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.874000 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.874059 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.874071 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.874108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.874122 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.976360 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.976393 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.976402 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.976414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4845]: I0202 10:32:35.976443 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.079273 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.079347 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.079367 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.079389 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.079403 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.182264 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.182350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.182374 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.182403 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.182424 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.284783 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.284873 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.284908 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.284932 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.284949 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.375249 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.375401 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.375492 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs podName:84cb7b66-62e7-4012-ab80-7c5e6ba51e35 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:38.375470463 +0000 UTC m=+39.466871913 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs") pod "network-metrics-daemon-pmn9h" (UID: "84cb7b66-62e7-4012-ab80-7c5e6ba51e35") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.387052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.387120 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.387136 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.387159 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.387176 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.439300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.439341 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.439352 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.439367 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.439378 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.453158 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.457698 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.457731 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.457741 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.457757 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.457768 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.471058 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.477659 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.477697 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.477709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.477724 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.477737 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.490521 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.494710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.494752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.494763 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.494777 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.494787 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.507284 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.511682 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.511726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.511737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.511754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.511769 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.525121 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.525283 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.526927 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.526962 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.526973 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.526989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.527001 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.628796 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.628840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.628851 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.628867 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.628877 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.694403 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 15:09:29.015186306 +0000 UTC Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.711756 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:36 crc kubenswrapper[4845]: E0202 10:32:36.711947 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.731400 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.731447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.731458 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.731473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.731483 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.833833 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.833869 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.833878 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.833918 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.833936 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.935985 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.936032 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.936083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.936108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4845]: I0202 10:32:36.936130 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.038444 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.038528 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.038556 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.038580 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.038598 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.141867 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.141934 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.141944 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.141960 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.141970 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.245257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.245302 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.245313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.245330 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.245341 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.348047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.348079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.348088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.348106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.348119 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.450669 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.450711 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.450726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.450745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.450763 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.553654 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.553689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.553699 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.553713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.553724 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.657189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.657247 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.657261 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.657280 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.657294 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.695183 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 17:09:00.700659985 +0000 UTC Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.712554 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.712653 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.712735 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:37 crc kubenswrapper[4845]: E0202 10:32:37.712723 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:37 crc kubenswrapper[4845]: E0202 10:32:37.712878 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:37 crc kubenswrapper[4845]: E0202 10:32:37.713014 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.759534 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.759586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.759601 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.759622 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.759635 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.863242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.863274 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.863282 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.863294 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.863303 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.966366 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.966452 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.966473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.966498 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4845]: I0202 10:32:37.966517 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.069105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.069177 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.069195 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.069219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.069236 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.172683 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.172766 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.172790 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.172818 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.172839 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.284445 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.284507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.284519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.284536 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.284550 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.386841 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.386923 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.386941 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.386962 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.386976 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.395914 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:38 crc kubenswrapper[4845]: E0202 10:32:38.396173 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:38 crc kubenswrapper[4845]: E0202 10:32:38.396287 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs podName:84cb7b66-62e7-4012-ab80-7c5e6ba51e35 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:42.396253563 +0000 UTC m=+43.487655053 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs") pod "network-metrics-daemon-pmn9h" (UID: "84cb7b66-62e7-4012-ab80-7c5e6ba51e35") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.490107 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.490144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.490156 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.490172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.490183 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.593293 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.593349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.593367 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.593389 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.593406 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.695705 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 17:10:11.846007774 +0000 UTC Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.696198 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.696253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.696305 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.696331 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.696347 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.711724 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:38 crc kubenswrapper[4845]: E0202 10:32:38.711843 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.799541 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.799597 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.799613 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.799633 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.799650 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.902160 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.902198 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.902209 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.902225 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4845]: I0202 10:32:38.902235 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.004714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.004765 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.004784 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.004801 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.004814 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.107081 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.107138 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.107154 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.107178 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.107195 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.210871 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.210938 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.210953 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.210973 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.210985 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.313988 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.314025 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.314037 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.314062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.314075 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.417013 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.417055 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.417064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.417079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.417091 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.520165 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.520211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.520226 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.520245 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.520258 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.622270 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.622328 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.622346 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.622373 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.622391 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.696068 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:25:45.786647213 +0000 UTC Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.712094 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.712145 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:39 crc kubenswrapper[4845]: E0202 10:32:39.712421 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:39 crc kubenswrapper[4845]: E0202 10:32:39.712558 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.713005 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:39 crc kubenswrapper[4845]: E0202 10:32:39.713117 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.725545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.725646 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.725667 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.725731 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.725764 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.727873 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.741987 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.756050 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.770111 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.786036 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.804466 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.817757 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc 
kubenswrapper[4845]: I0202 10:32:39.827374 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.828622 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.828669 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.828681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.828697 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.828708 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.842295 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z 
is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.854749 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.865391 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.875336 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.885015 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.896385 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.910011 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.928332 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:31Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:31.598437 6245 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:31.598495 6245 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:31.598528 6245 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0202 10:32:31.598541 6245 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:31.598581 6245 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:31.598586 6245 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:31.598588 6245 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:32:31.598604 6245 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:31.598627 6245 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:31.598660 6245 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:31.598674 6245 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:31.598726 6245 factory.go:656] Stopping watch factory\\\\nI0202 10:32:31.598752 6245 ovnkube.go:599] Stopped ovnkube\\\\nI0202 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: 
[failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.931096 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.931144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.931161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.931182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.931196 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4845]: I0202 10:32:39.956743 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.033519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.033572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.033589 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.033610 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.033626 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.136596 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.136659 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.136682 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.136712 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.136733 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.239681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.239818 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.239842 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.239872 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.239927 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.343199 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.343272 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.343297 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.343326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.343347 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.446785 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.446859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.446928 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.447022 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.447052 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.550620 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.550693 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.550717 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.550745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.550769 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.652837 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.652983 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.653020 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.653048 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.653069 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.696384 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:01:46.230312422 +0000 UTC Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.712468 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:40 crc kubenswrapper[4845]: E0202 10:32:40.712602 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.755556 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.755583 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.755592 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.755607 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.755616 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.858723 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.858784 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.858800 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.858823 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.858841 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.961503 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.961690 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.961724 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.961757 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4845]: I0202 10:32:40.961777 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.065038 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.065103 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.065115 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.065133 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.065146 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.168673 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.168727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.168737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.168752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.168762 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.272681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.272743 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.272760 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.272991 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.274382 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.378116 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.378182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.378200 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.378221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.378238 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.480670 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.480712 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.480721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.480733 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.480742 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.583776 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.583828 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.583845 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.583859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.583875 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.686983 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.687036 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.687047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.687065 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.687076 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.696690 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 01:17:45.246058537 +0000 UTC Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.712183 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.712213 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.712211 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:41 crc kubenswrapper[4845]: E0202 10:32:41.712340 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:41 crc kubenswrapper[4845]: E0202 10:32:41.712494 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:41 crc kubenswrapper[4845]: E0202 10:32:41.712660 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.793537 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.794754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.794772 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.794787 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.794797 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.897619 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.897668 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.897680 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.897697 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4845]: I0202 10:32:41.897709 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.001789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.001923 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.001936 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.001955 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.001967 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.105373 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.105419 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.105429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.105445 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.105458 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.207757 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.208092 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.208105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.208122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.208133 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.311013 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.311076 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.311094 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.311120 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.311138 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.413657 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.413739 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.413763 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.413798 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.413821 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.445881 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:42 crc kubenswrapper[4845]: E0202 10:32:42.446169 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:42 crc kubenswrapper[4845]: E0202 10:32:42.446310 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs podName:84cb7b66-62e7-4012-ab80-7c5e6ba51e35 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:50.446277519 +0000 UTC m=+51.537678999 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs") pod "network-metrics-daemon-pmn9h" (UID: "84cb7b66-62e7-4012-ab80-7c5e6ba51e35") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.517105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.517184 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.517209 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.517239 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.517332 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.619794 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.619835 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.619843 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.619856 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.619905 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.697615 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 00:12:38.640367769 +0000 UTC Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.711858 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.712373 4845 scope.go:117] "RemoveContainer" containerID="5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5" Feb 02 10:32:42 crc kubenswrapper[4845]: E0202 10:32:42.712716 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.724048 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.724079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.724089 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.724121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.724131 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.826394 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.826435 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.826453 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.826473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.826485 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.929220 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.929273 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.929284 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.929304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4845]: I0202 10:32:42.929319 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.032265 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.032304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.032315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.032330 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.032341 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.087379 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.089011 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.089461 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.103338 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9
fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/k
ubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.114227 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020f
aaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.123800 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.134722 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924a
e188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.135097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.135136 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.135148 4845 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.135165 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.135175 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.148410 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27670
3f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.161803 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.174516 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.193051 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b434ba96232b8fa4880bdd4beca2db5c6a159643e207586ab3b6401ef09fdf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:31Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:31.598437 6245 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:31.598495 6245 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:31.598528 6245 handler.go:190] Sending *v1.Namespace event handler 1 for 
removal\\\\nI0202 10:32:31.598541 6245 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:31.598559 6245 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:31.598581 6245 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:31.598586 6245 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:31.598588 6245 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0202 10:32:31.598604 6245 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:31.598627 6245 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:31.598660 6245 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:31.598674 6245 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:31.598726 6245 factory.go:656] Stopping watch factory\\\\nI0202 10:32:31.598752 6245 ovnkube.go:599] Stopped ovnkube\\\\nI0202 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: 
[failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mo
untPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api
-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.210646 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0
,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.224260 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.237418 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.237465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.237477 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.237493 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.237504 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.238684 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.252584 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.269056 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.280293 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.290581 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.303279 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.312352 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:43 crc 
kubenswrapper[4845]: I0202 10:32:43.339714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.339752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.339761 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.339775 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.339784 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.442666 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.442725 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.442743 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.442768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.442785 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.545733 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.545777 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.545791 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.545808 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.545817 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.647995 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.648043 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.648053 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.648069 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.648109 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.698832 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 12:06:06.257532798 +0000 UTC Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.712214 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.712248 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:43 crc kubenswrapper[4845]: E0202 10:32:43.712344 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.712359 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:43 crc kubenswrapper[4845]: E0202 10:32:43.712473 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:43 crc kubenswrapper[4845]: E0202 10:32:43.712615 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.749978 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.750014 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.750023 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.750036 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.750045 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.853281 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.853328 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.853339 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.853357 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.853369 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.955840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.955873 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.955904 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.955920 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4845]: I0202 10:32:43.955930 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.058266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.058311 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.058320 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.058335 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.058345 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.161096 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.161143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.161155 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.161172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.161184 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.264129 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.264213 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.264234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.264262 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.264286 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.368075 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.368135 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.368150 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.368172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.368187 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.471507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.471563 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.471580 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.471603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.471621 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.575354 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.575416 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.575426 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.575468 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.575480 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.678656 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.678703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.678716 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.678735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.678750 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.699429 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 08:00:26.023101947 +0000 UTC Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.711863 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:44 crc kubenswrapper[4845]: E0202 10:32:44.712035 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.781854 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.781929 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.781945 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.781959 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.781970 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.885139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.885182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.885193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.885211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.885224 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.988568 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.988621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.988639 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.988666 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4845]: I0202 10:32:44.988683 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.092049 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.092135 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.092154 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.092178 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.092197 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.194972 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.195024 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.195041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.195064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.195081 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.298398 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.298464 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.298483 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.298510 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.298530 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.402105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.402169 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.402187 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.402214 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.402235 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.505747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.505808 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.505835 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.505863 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.505925 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.608344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.608381 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.608392 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.608409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.608422 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.699938 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 18:15:29.626326592 +0000 UTC Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.711604 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.711647 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.711722 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.711756 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.711771 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.711791 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.711808 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: E0202 10:32:45.712023 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.712098 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:45 crc kubenswrapper[4845]: E0202 10:32:45.712234 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:45 crc kubenswrapper[4845]: E0202 10:32:45.712369 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.814462 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.814533 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.814552 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.814577 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.814599 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.917397 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.917463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.917485 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.917513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4845]: I0202 10:32:45.917534 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.020468 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.020503 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.020513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.020525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.020535 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.123188 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.123259 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.123287 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.123317 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.123339 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.227380 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.227464 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.227487 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.227513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.227533 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.330181 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.330221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.330231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.330244 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.330255 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.433011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.433086 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.433100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.433121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.433135 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.536872 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.536958 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.536970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.536988 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.537003 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.639533 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.639571 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.639582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.639596 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.639608 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.700533 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 15:38:14.221496548 +0000 UTC Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.711790 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:46 crc kubenswrapper[4845]: E0202 10:32:46.711948 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.741754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.741797 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.741806 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.741823 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.741834 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.756006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.756042 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.756052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.756106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.756116 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: E0202 10:32:46.781276 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:46Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.790739 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.790809 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.790824 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.790871 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.790914 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: E0202 10:32:46.813460 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:46Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.816519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.816540 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.816548 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.816572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.816581 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: E0202 10:32:46.831608 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:46Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.835391 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.835423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.835432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.835446 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.835454 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: E0202 10:32:46.846076 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:46Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.849450 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.849479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.849490 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.849505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.849514 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: E0202 10:32:46.860526 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:46Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:46 crc kubenswrapper[4845]: E0202 10:32:46.860653 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.862407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.862444 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.862455 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.862472 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.862484 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.964931 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.964972 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.964982 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.964996 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4845]: I0202 10:32:46.965008 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.067181 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.067222 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.067234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.067246 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.067255 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.169161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.169208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.169221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.169236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.169246 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.271537 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.271585 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.271602 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.271622 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.271637 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.374323 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.374373 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.374384 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.374406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.374436 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.476367 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.476399 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.476407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.476419 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.476427 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.579149 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.579187 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.579199 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.579214 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.579223 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.681683 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.681725 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.681734 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.681748 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.681757 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.700961 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 15:00:21.197929818 +0000 UTC
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.712525 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.712578 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.712538 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:32:47 crc kubenswrapper[4845]: E0202 10:32:47.712948 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:32:47 crc kubenswrapper[4845]: E0202 10:32:47.713025 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:32:47 crc kubenswrapper[4845]: E0202 10:32:47.713091 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.713114 4845 scope.go:117] "RemoveContainer" containerID="85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.725063 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.738139 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.746327 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc 
kubenswrapper[4845]: I0202 10:32:47.756506 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.766120 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.774930 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.783468 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.783574 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.783600 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.783712 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.783815 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.785111 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.794751 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.803630 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.817190 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e
349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.828383 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.836492 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] 
[default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.855532 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.868452 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.879994 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 
10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.886635 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.886667 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.886678 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.886696 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.886707 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.894777 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5
225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.906615 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.917096 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.993189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.993236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.993246 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.993260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4845]: I0202 10:32:47.993270 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.096672 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.096739 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.096758 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.096781 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.096798 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.109043 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/1.log" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.113007 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.113565 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.136092 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.149267 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.173091 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.187377 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc 
kubenswrapper[4845]: I0202 10:32:48.199217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.199258 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.199271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.199293 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.199307 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.207623 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.228191 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.246338 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.258718 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedf
dab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.269010 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.280870 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.299701 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.301411 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.301452 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.301463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.301480 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.301491 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.314698 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.328202 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 
10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.350614 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 
obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.369821 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.383240 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.397434 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.404009 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.404041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.404052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.404067 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.404077 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.506732 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.506769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.506779 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.506793 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.506803 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.609427 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.609476 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.609488 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.609509 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.609522 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.701528 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 23:16:56.392135491 +0000 UTC Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.711509 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:48 crc kubenswrapper[4845]: E0202 10:32:48.711637 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.712505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.712540 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.712552 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.712563 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.712572 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.815662 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.815778 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.815809 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.815838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.815862 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.919496 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.919628 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.919665 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.919693 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4845]: I0202 10:32:48.919717 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.022138 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.022202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.022219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.022242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.022259 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.120404 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/2.log" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.121503 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/1.log" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.124609 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.124661 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.124677 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.124700 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.124717 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.125369 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77" exitCode=1 Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.125442 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.125555 4845 scope.go:117] "RemoveContainer" containerID="85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.128151 4845 scope.go:117] "RemoveContainer" containerID="939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77" Feb 02 10:32:49 crc kubenswrapper[4845]: E0202 10:32:49.128501 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.153767 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.167600 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc 
kubenswrapper[4845]: I0202 10:32:49.182056 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.204060 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.215979 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.229411 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.229554 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.229582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.229699 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.229773 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.231064 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.246593 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.261967 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.278567 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e
349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.295144 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de92
61d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.320918 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f9
5e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.332937 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.333008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.333033 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.333062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.333085 4845 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.347615 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.368673 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 
10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.399317 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] 
Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\"
,\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/
serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.419208 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.436118 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.436194 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.436213 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.436236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.436253 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.439124 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.462350 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.539957 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.540052 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.540064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.540086 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.540102 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.643422 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.643503 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.643526 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.643571 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.643610 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.702190 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 08:07:23.312234367 +0000 UTC Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.711720 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.711741 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.711991 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:49 crc kubenswrapper[4845]: E0202 10:32:49.712049 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:49 crc kubenswrapper[4845]: E0202 10:32:49.711858 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:49 crc kubenswrapper[4845]: E0202 10:32:49.712220 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.738731 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.746725 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.746981 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.746995 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.747011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.747021 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.752843 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.763506 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 
10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.779808 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://85af18ff654e362360063055e913234601da73cd85fa7c0c61b518cd1c420d3d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\" 6365 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nI0202 10:32:32.819174 6365 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-dns/node-resolver-rzb6b\\\\nI0202 10:32:32.819204 6365 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-thbz4\\\\nF0202 10:32:32.819209 6365 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z]\\\\nI0202 10:32:32.819214 6365 obj_retry.g\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] 
Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\"
,\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/
serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.792367 4845 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.807707 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.819058 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.831584 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.839531 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc 
kubenswrapper[4845]: I0202 10:32:49.850015 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.850656 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.850694 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.850704 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.850716 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.850727 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.862086 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.872524 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.882783 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.893124 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.902745 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.913312 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.923108 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedf
dab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.952480 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.952530 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.952543 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.952561 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4845]: I0202 10:32:49.952574 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.055088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.055132 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.055143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.055158 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.055169 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.130135 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/2.log" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.134447 4845 scope.go:117] "RemoveContainer" containerID="939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77" Feb 02 10:32:50 crc kubenswrapper[4845]: E0202 10:32:50.134817 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.150376 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.157378 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.157422 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.157438 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.157483 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.157523 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.165699 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.181607 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.197364 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.213346 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.227311 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc 
kubenswrapper[4845]: I0202 10:32:50.241253 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.257446 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.261582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.261643 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.261658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc 
kubenswrapper[4845]: I0202 10:32:50.261683 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.261695 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.271088 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.283449 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924a
e188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.295851 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.305239 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.317375 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3
a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.348180 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.364431 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.364507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.364531 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.364564 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.364588 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.379741 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.399064 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.415413 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:50Z is after 2025-08-24T17:21:41Z" Feb 02 
10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.467113 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.467159 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.467170 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.467186 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.467198 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.547400 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:50 crc kubenswrapper[4845]: E0202 10:32:50.547567 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:50 crc kubenswrapper[4845]: E0202 10:32:50.547624 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs podName:84cb7b66-62e7-4012-ab80-7c5e6ba51e35 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:06.547607892 +0000 UTC m=+67.639009352 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs") pod "network-metrics-daemon-pmn9h" (UID: "84cb7b66-62e7-4012-ab80-7c5e6ba51e35") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.569360 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.569423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.569440 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.569465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.569482 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.673191 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.673259 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.673284 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.673315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.673341 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.703004 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 19:21:27.916397137 +0000 UTC Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.712587 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:50 crc kubenswrapper[4845]: E0202 10:32:50.712842 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.776232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.776680 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.776707 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.776736 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.776758 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.879680 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.879750 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.879769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.879795 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.879812 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.985993 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.986063 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.986078 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.986103 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4845]: I0202 10:32:50.986121 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.089144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.089205 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.089225 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.089249 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.089267 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.191671 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.191750 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.191773 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.191805 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.191828 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.294733 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.294793 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.294809 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.294832 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.294846 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.397308 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.397371 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.397394 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.397422 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.397443 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.500459 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.500542 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.500564 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.500597 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.500620 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.603993 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.604046 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.604058 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.604075 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.604089 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.659597 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.659774 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.659801 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:23.659774733 +0000 UTC m=+84.751176183 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.659852 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.659923 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.659945 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660015 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:23.659980499 +0000 UTC m=+84.751381979 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.660048 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660078 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660078 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660094 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660101 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660106 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660112 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660139 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:23.660132004 +0000 UTC m=+84.751533454 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660150 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:23.660145004 +0000 UTC m=+84.751546454 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660177 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.660223 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:23.660210126 +0000 UTC m=+84.751611606 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.703545 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:18:55.512137756 +0000 UTC Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.706834 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.706920 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.706946 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.706972 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.706990 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.712109 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.712191 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.712205 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.712315 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.712398 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:51 crc kubenswrapper[4845]: E0202 10:32:51.712483 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.810314 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.810364 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.810380 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.810403 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.810419 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.913575 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.913611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.913621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.913636 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4845]: I0202 10:32:51.913647 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.016383 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.016449 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.016467 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.016493 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.016510 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.119576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.119625 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.119638 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.119655 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.119668 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.222168 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.222205 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.222217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.222232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.222243 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.324687 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.324754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.324773 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.324798 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.324817 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.429561 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.429623 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.429645 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.429670 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.429687 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.532280 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.532324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.532336 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.532352 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.532365 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.636103 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.636176 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.636200 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.636238 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.636262 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.704098 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 07:49:25.02519245 +0000 UTC Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.712554 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:52 crc kubenswrapper[4845]: E0202 10:32:52.712756 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.739756 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.739836 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.739859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.739927 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.739956 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.842442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.842511 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.842533 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.842557 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.842574 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.945616 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.945688 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.945710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.945741 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4845]: I0202 10:32:52.945764 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.009070 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.026004 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.033479 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\
"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.051170 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.051242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.051266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.051296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.051318 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.054975 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.076224 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.096164 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.113854 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.130318 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924a
e188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.157643 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.157721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.157744 4845 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.157769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.157790 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.164423 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b38
09b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.188337 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.208801 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.238979 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.257068 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.260589 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.260625 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.260645 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.260659 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.260670 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.273976 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.292747 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.309829 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.321113 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.338307 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.348521 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc 
kubenswrapper[4845]: I0202 10:32:53.363544 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.363649 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.363671 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.363698 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.363717 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.467272 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.467353 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.467377 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.467405 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.467428 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.569752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.569811 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.569831 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.569854 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.569871 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.673461 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.673584 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.673610 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.673638 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.673659 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.705240 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 18:24:20.432117415 +0000 UTC Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.712694 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.712768 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:53 crc kubenswrapper[4845]: E0202 10:32:53.712845 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.712859 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:53 crc kubenswrapper[4845]: E0202 10:32:53.713035 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:53 crc kubenswrapper[4845]: E0202 10:32:53.713117 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.776083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.776163 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.776187 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.776217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.776239 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.879323 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.879395 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.879545 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.879581 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.879604 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.983141 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.983209 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.983232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.983259 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4845]: I0202 10:32:53.983279 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.086488 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.086546 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.086563 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.086589 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.086605 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.188940 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.188986 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.189000 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.189019 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.189033 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.292129 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.292198 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.292235 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.292274 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.292299 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.395615 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.395672 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.395690 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.395713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.395732 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.498452 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.498510 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.498530 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.498556 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.498573 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.600774 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.600848 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.600866 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.600925 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.600943 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.704173 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.704262 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.704287 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.704315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.704335 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.706406 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 17:30:33.883768755 +0000 UTC Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.711775 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:54 crc kubenswrapper[4845]: E0202 10:32:54.712014 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.808379 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.808481 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.808509 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.808546 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.808594 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.911342 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.911404 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.911421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.911446 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4845]: I0202 10:32:54.911463 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.015044 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.015098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.015115 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.015137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.015153 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.117828 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.117910 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.117928 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.117950 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.117965 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.221300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.221370 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.221388 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.221413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.221435 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.324008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.324083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.324095 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.324112 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.324125 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.427409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.427488 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.427508 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.427537 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.427562 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.530820 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.530939 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.530962 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.531026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.531045 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.633710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.633762 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.633776 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.633794 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.633808 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.706833 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 00:20:47.078988543 +0000 UTC Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.712172 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.712249 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:55 crc kubenswrapper[4845]: E0202 10:32:55.712315 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:55 crc kubenswrapper[4845]: E0202 10:32:55.712439 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.712490 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:55 crc kubenswrapper[4845]: E0202 10:32:55.712635 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.736698 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.736760 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.736777 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.736798 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.736815 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.840255 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.840326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.840348 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.840376 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.840393 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.943229 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.943278 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.943293 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.943313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4845]: I0202 10:32:55.943407 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.047946 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.048032 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.048063 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.048094 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.048114 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.152485 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.152526 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.152535 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.152551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.152561 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.254973 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.255040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.255062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.255090 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.255112 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.357827 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.357926 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.357950 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.358015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.358038 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.460764 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.460809 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.460823 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.460844 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.460859 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.562701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.562743 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.562754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.562771 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.562782 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.665740 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.665801 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.665819 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.665843 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.665860 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.707211 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 13:08:51.824010733 +0000 UTC Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.712599 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:56 crc kubenswrapper[4845]: E0202 10:32:56.712777 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.767941 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.767987 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.767999 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.768015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.768028 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.871469 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.871540 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.871559 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.871585 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.871605 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.974369 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.974403 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.974411 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.974424 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4845]: I0202 10:32:56.974433 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.076934 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.077007 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.077025 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.077051 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.077069 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.179684 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.179731 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.179747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.179768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.179783 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.254394 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.254436 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.254448 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.254463 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.254475 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.275954 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.283677 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.283772 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.283787 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.283804 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.283823 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.302824 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.309935 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.309987 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.310005 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.310031 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.310048 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.329021 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.333744 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.333795 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.333810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.333829 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.333842 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.348217 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.352308 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.352391 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.352417 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.352494 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.352519 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.372578 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.372768 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.374576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.374603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.374615 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.374631 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.374644 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.477806 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.477863 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.477881 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.477942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.477960 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.580434 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.580496 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.580513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.580537 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.580554 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.683535 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.683582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.683601 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.683626 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.683646 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.708326 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 19:40:18.161965172 +0000 UTC Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.711608 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.711681 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.711691 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.711767 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.711826 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:57 crc kubenswrapper[4845]: E0202 10:32:57.711878 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.785802 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.785919 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.785945 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.785974 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.785996 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.888692 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.888755 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.888779 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.888810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.888833 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.925916 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.948206 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.971451 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.993106 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:57Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.993651 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.993710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.993735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.993766 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4845]: I0202 10:32:57.993788 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.013096 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.030082 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.050444 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.069088 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc 
kubenswrapper[4845]: I0202 10:32:58.087317 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.096985 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.097046 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.097073 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.097103 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.097125 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.103802 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\"
:\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.136269 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.153342 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedf
dab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.170460 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.188510 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.200362 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.200423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.200446 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.200478 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.200501 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.209099 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.230743 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64
c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.254457 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.285234 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.303584 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.303617 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.303628 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.303645 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.303656 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.319236 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:58Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.407231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.407273 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.407283 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.407304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.407318 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.509983 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.510048 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.510070 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.510100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.510121 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.621242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.621296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.621315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.621338 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.621355 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.709265 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 11:07:19.277626215 +0000 UTC Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.712545 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:32:58 crc kubenswrapper[4845]: E0202 10:32:58.712666 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.724230 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.724374 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.724397 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.724424 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.724443 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.831108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.831236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.831263 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.831292 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.831312 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.934224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.934280 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.934296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.934318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4845]: I0202 10:32:58.934337 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.037966 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.038030 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.038049 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.038073 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.038090 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.141411 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.141473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.141501 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.141546 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.141593 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.245044 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.245121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.245144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.245173 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.245195 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.348243 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.348321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.348347 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.348376 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.348400 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.451289 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.451345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.451363 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.451387 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.451405 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.554657 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.554763 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.554789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.554820 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.554844 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.657400 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.657446 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.657457 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.657474 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.657487 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.710057 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 10:09:01.018661852 +0000 UTC Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.711588 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:59 crc kubenswrapper[4845]: E0202 10:32:59.711709 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.711796 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.711834 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:59 crc kubenswrapper[4845]: E0202 10:32:59.711922 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:59 crc kubenswrapper[4845]: E0202 10:32:59.711981 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.728579 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\
\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.745452 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.760076 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.760121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.760133 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.760152 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.760164 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.763273 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.779335 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.793282 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.811238 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.826105 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.836676 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc 
kubenswrapper[4845]: I0202 10:32:59.850766 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.860875 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.862024 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.862106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.862121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.862184 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.862199 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.876473 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.892305 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.906163 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.922469 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.945772 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.965367 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.965476 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.965502 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.965572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.965595 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.968010 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
2600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:59 crc kubenswrapper[4845]: I0202 10:32:59.983727 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:59Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.011134 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:00Z is after 2025-08-24T17:21:41Z"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.069255 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.069328 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.069354 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.069385 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.069408 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.172474 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.172544 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.172564 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.172588 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.172607 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.275805 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.275873 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.275924 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.275955 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.276104 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.379291 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.379350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.379369 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.379394 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.379410 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.490193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.490264 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.490298 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.490339 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.490363 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.592986 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.593050 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.593074 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.593128 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.593154 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.696447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.696485 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.696497 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.696512 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.696524 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.712045 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:53:05.161931636 +0000 UTC
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.712239 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:00 crc kubenswrapper[4845]: E0202 10:33:00.712394 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.800040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.800094 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.800111 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.800137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.800154 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.903028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.903121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.903147 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.903177 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:00 crc kubenswrapper[4845]: I0202 10:33:00.903198 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.006140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.006188 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.006197 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.006210 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.006218 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.108220 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.108312 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.108320 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.108335 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.108344 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.211428 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.211493 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.211517 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.211548 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.211570 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.314299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.314432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.314445 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.314464 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.314477 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.417429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.417503 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.417515 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.417530 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.417541 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.522458 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.522507 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.522518 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.522537 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.522548 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.625112 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.625168 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.625185 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.625208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.625227 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.711829 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.711915 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.711829 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:01 crc kubenswrapper[4845]: E0202 10:33:01.712071 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.712169 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:02:51.17081876 +0000 UTC
Feb 02 10:33:01 crc kubenswrapper[4845]: E0202 10:33:01.712269 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:01 crc kubenswrapper[4845]: E0202 10:33:01.712356 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.727101 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.727146 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.727161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.727179 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.727193 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.829963 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.830021 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.830039 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.830067 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.830087 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.933014 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.933088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.933106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.933129 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4845]: I0202 10:33:01.933148 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.035928 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.035964 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.035973 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.036012 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.036031 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.137675 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.137708 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.137718 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.137733 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.137744 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.240248 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.240324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.240344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.240370 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.240447 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.342384 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.342439 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.342458 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.342481 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.342498 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.445548 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.445612 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.445621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.445638 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.445648 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.548381 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.548409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.548418 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.548430 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.548439 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.651151 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.651209 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.651225 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.651247 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.651264 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.732245 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 11:08:15.500226477 +0000 UTC Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.732417 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:02 crc kubenswrapper[4845]: E0202 10:33:02.732635 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.733684 4845 scope.go:117] "RemoveContainer" containerID="939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77" Feb 02 10:33:02 crc kubenswrapper[4845]: E0202 10:33:02.733994 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.760951 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.761008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.761019 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.761037 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.761047 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.863323 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.863374 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.863388 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.863407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.863426 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.966692 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.966736 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.966750 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.966768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4845]: I0202 10:33:02.966779 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.069054 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.069102 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.069116 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.069132 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.069143 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.173041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.173119 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.173144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.173172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.173192 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.275647 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.275690 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.275701 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.275720 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.275732 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.379012 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.379058 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.379070 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.379089 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.379099 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.481844 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.481913 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.481924 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.481942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.481956 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.584570 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.584611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.584621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.584634 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.584642 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.687102 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.687144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.687157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.687172 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.687184 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.711902 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.712006 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.712133 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:03 crc kubenswrapper[4845]: E0202 10:33:03.712122 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:03 crc kubenswrapper[4845]: E0202 10:33:03.712230 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:03 crc kubenswrapper[4845]: E0202 10:33:03.712337 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.732672 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 02:44:19.041573 +0000 UTC Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.789826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.789857 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.789868 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.789897 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.789907 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.892200 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.892265 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.892288 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.892312 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.892329 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.994760 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.994806 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.994817 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.994833 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4845]: I0202 10:33:03.994847 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.097947 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.097994 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.098005 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.098024 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.098037 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.201414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.201473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.201492 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.201517 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.201535 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.304505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.304541 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.304551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.304566 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.304577 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.407263 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.407302 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.407310 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.407325 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.407335 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.509068 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.509102 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.509112 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.509124 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.509132 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.612075 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.612127 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.612138 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.612157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.612170 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.711726 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:04 crc kubenswrapper[4845]: E0202 10:33:04.711861 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.714645 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.714681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.714693 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.714709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.714722 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.732990 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 22:08:10.119291042 +0000 UTC Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.816811 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.816859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.816873 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.816908 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.816922 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.919846 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.919927 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.919945 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.919969 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:04 crc kubenswrapper[4845]: I0202 10:33:04.919988 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:04Z","lastTransitionTime":"2026-02-02T10:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.022674 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.022722 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.022732 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.022747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.022756 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.125836 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.125868 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.125880 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.125911 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.125922 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.228016 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.228074 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.228086 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.228102 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.228113 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.331141 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.331193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.331202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.331215 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.331224 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.433079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.433126 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.433139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.433157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.433169 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.535757 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.535788 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.535797 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.535810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.535819 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.638423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.638466 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.638487 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.638505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.638769 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.711873 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.711917 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:05 crc kubenswrapper[4845]: E0202 10:33:05.712017 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.712172 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:05 crc kubenswrapper[4845]: E0202 10:33:05.712223 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:05 crc kubenswrapper[4845]: E0202 10:33:05.712386 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.733629 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 18:16:35.856213689 +0000 UTC Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.742237 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.742272 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.742284 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.742300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.742312 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.845439 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.845468 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.845476 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.845488 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.845496 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.948520 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.948572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.948582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.948598 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:05 crc kubenswrapper[4845]: I0202 10:33:05.948607 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:05Z","lastTransitionTime":"2026-02-02T10:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.051556 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.051586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.051594 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.051608 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.051617 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.154573 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.154636 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.154659 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.154689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.154712 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.257547 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.257595 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.257607 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.257624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.257636 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.360064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.360100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.360110 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.360125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.360138 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.462707 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.462778 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.462802 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.462861 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.462923 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.565642 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.565733 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.565757 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.565783 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.565800 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.624406 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:06 crc kubenswrapper[4845]: E0202 10:33:06.624567 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:33:06 crc kubenswrapper[4845]: E0202 10:33:06.624638 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs podName:84cb7b66-62e7-4012-ab80-7c5e6ba51e35 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.624618489 +0000 UTC m=+99.716019939 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs") pod "network-metrics-daemon-pmn9h" (UID: "84cb7b66-62e7-4012-ab80-7c5e6ba51e35") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.668920 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.668971 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.668984 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.669004 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.669020 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.712549 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:06 crc kubenswrapper[4845]: E0202 10:33:06.712786 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.734104 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 23:42:21.230345551 +0000 UTC Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.771467 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.771517 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.771532 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.771549 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.771562 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.873925 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.873964 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.873974 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.873989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.873999 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.976548 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.976594 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.976606 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.976621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:06 crc kubenswrapper[4845]: I0202 10:33:06.976632 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:06Z","lastTransitionTime":"2026-02-02T10:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.079038 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.079090 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.079101 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.079119 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.079130 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.181792 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.181859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.181881 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.181958 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.181982 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.284775 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.284839 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.284859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.284876 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.284904 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.387354 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.387404 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.387413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.387429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.387440 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.489743 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.489782 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.489792 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.489807 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.489816 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.592521 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.592580 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.592598 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.592620 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.592638 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.686672 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.686713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.686721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.686735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.686746 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.705705 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.712338 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.712406 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.712462 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.712498 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.712655 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.712721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.712739 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.712810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.712842 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.713074 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.713213 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.726733 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.733498 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5
987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.734655 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 19:07:17.037149988 +0000 UTC Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.737576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.737748 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.737918 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.738058 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.738193 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.755411 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.759609 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.759640 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.759649 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.759663 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.759673 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.775183 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.779079 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.779376 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.779557 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.779736 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.779961 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.797076 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:07Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:07 crc kubenswrapper[4845]: E0202 10:33:07.797324 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.799375 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.799421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.799431 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.799448 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.799459 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.901948 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.902008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.902025 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.902048 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:07 crc kubenswrapper[4845]: I0202 10:33:07.902067 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:07Z","lastTransitionTime":"2026-02-02T10:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.005640 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.005689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.005703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.005720 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.005734 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.107775 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.107810 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.107823 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.107838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.107849 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.210369 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.210409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.210435 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.210450 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.210460 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.312946 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.312986 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.312998 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.313012 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.313021 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.415709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.415859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.415909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.415942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.415962 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.518358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.518407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.518415 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.518429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.518439 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.621325 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.621349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.621358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.621370 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.621378 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.712359 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:08 crc kubenswrapper[4845]: E0202 10:33:08.712536 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.724336 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.724406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.724429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.724457 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.724480 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.736870 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 23:18:50.192625871 +0000 UTC Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.827000 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.827055 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.827074 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.827100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.827117 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.929863 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.930071 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.930098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.930126 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:08 crc kubenswrapper[4845]: I0202 10:33:08.930145 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:08Z","lastTransitionTime":"2026-02-02T10:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.033174 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.033222 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.033234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.033250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.033260 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.136196 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.136236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.136245 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.136261 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.136271 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.198855 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/0.log" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.198916 4845 generic.go:334] "Generic (PLEG): container finished" podID="310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3" containerID="2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a" exitCode=1 Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.198941 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kzwst" event={"ID":"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3","Type":"ContainerDied","Data":"2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.199282 4845 scope.go:117] "RemoveContainer" containerID="2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.216938 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:08Z\\\",\\\"message\\\":\\\"2026-02-02T10:32:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574\\\\n2026-02-02T10:32:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574 to /host/opt/cni/bin/\\\\n2026-02-02T10:32:23Z [verbose] multus-daemon started\\\\n2026-02-02T10:32:23Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.229975 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.241328 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.241362 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.241372 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.241387 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.241397 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.243821 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.255045 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22b
ddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.268193 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.280227 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.295037 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.312333 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.346778 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.347020 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.347165 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.347313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.347385 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.351658 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.383768 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89
d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.397517 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.409927 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.422998 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.435779 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.448905 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.449519 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.449627 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.449709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.449797 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.449899 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.463595 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.473750 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc 
kubenswrapper[4845]: I0202 10:33:09.485721 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.498057 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0555b7db-8395-4133-bcbb-cc7b913aa8d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.553020 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.553067 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.553080 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.553097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.553111 4845 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.655306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.655351 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.655360 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.655374 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.655383 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.712098 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.712113 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:09 crc kubenswrapper[4845]: E0202 10:33:09.712210 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:09 crc kubenswrapper[4845]: E0202 10:33:09.712297 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.712113 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:09 crc kubenswrapper[4845]: E0202 10:33:09.712618 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.728164 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.737802 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 01:01:53.985907649 +0000 UTC Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.740824 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355123
35ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.757831 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.757864 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.757872 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.757903 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.757912 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.759594 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:09Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:08Z\\\",\\\"message\\\":\\\"2026-02-02T10:32:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574\\\\n2026-02-02T10:32:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574 to /host/opt/cni/bin/\\\\n2026-02-02T10:32:23Z [verbose] multus-daemon started\\\\n2026-02-02T10:32:23Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.776483 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.789304 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.801229 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924a
e188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.823865 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.840687 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.855842 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.859840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.860010 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.860076 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.860147 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.860208 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.878203 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.891277 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.903999 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.915707 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.928716 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.939262 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0555b7db-8395-4133-bcbb-cc7b913aa8d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.951686 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.965072 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 
10:33:09.965126 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.965143 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.965165 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.965180 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:09Z","lastTransitionTime":"2026-02-02T10:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.965923 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.982566 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:09 crc kubenswrapper[4845]: I0202 10:33:09.993113 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc 
kubenswrapper[4845]: I0202 10:33:10.067176 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.067218 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.067230 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.067246 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.067258 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.169075 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.169116 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.169125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.169142 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.169177 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.202708 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/0.log" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.202756 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kzwst" event={"ID":"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3","Type":"ContainerStarted","Data":"a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.220061 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.236224 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.252308 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.265455 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.271406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.271454 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.271483 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.271510 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.271529 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.277626 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.292212 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.304226 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc 
kubenswrapper[4845]: I0202 10:33:10.315838 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.328231 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0555b7db-8395-4133-bcbb-cc7b913aa8d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.342187 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:08Z\\\",\\\"message\\\":\\\"2026-02-02T10:32:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574\\\\n2026-02-02T10:32:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574 to /host/opt/cni/bin/\\\\n2026-02-02T10:32:23Z [verbose] multus-daemon started\\\\n2026-02-02T10:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:33:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.353188 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.363780 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.374123 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.374157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.374167 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.374180 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.374190 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.375111 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.384807 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.395029 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.406518 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.433481 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.462302 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.476108 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.476145 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.476155 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.476170 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.476180 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.479452 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.578998 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.579062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.579133 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.579191 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.579210 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.681349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.681377 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.681385 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.681398 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.681409 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.712108 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:10 crc kubenswrapper[4845]: E0202 10:33:10.712355 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.738705 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:36:59.923255645 +0000 UTC Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.784639 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.784693 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.784711 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.784735 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.784751 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.887202 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.887259 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.887275 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.887296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.887309 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.989793 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.989832 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.989840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.989853 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:10 crc kubenswrapper[4845]: I0202 10:33:10.989862 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:10Z","lastTransitionTime":"2026-02-02T10:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.092221 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.092299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.092371 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.092447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.092474 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.195518 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.195560 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.195572 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.195588 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.195599 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.298252 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.298304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.298322 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.298344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.298360 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.401182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.401228 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.401244 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.401267 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.401284 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.504423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.504465 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.504474 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.504488 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.504498 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.607048 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.607122 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.607140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.607163 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.607180 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.710841 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.710948 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.710974 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.711006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.711027 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.711827 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.711830 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.711831 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:11 crc kubenswrapper[4845]: E0202 10:33:11.712157 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:11 crc kubenswrapper[4845]: E0202 10:33:11.712004 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:11 crc kubenswrapper[4845]: E0202 10:33:11.712233 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.739064 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 11:34:28.252021605 +0000 UTC Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.813566 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.813628 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.813645 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.813673 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.813689 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.916101 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.916140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.916191 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.916224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:11 crc kubenswrapper[4845]: I0202 10:33:11.916237 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:11Z","lastTransitionTime":"2026-02-02T10:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.019015 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.019083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.019098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.019114 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.019126 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.121499 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.121578 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.121595 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.121650 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.121667 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.224624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.224686 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.224705 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.224728 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.224746 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.326841 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.326900 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.326913 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.326931 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.326944 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.430271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.430315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.430330 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.430350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.430364 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.532866 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.532918 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.532934 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.532952 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.532963 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.635721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.635771 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.635788 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.635811 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.635827 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.711529 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:12 crc kubenswrapper[4845]: E0202 10:33:12.711658 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.738840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.738900 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.738909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.738926 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.738934 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.739355 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 17:32:02.624850816 +0000 UTC Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.842592 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.842662 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.842680 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.842705 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.842722 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.945662 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.945725 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.945737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.945752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:12 crc kubenswrapper[4845]: I0202 10:33:12.945763 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:12Z","lastTransitionTime":"2026-02-02T10:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.047772 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.047816 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.047826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.047844 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.047855 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.150407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.150470 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.150487 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.150512 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.150531 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.252611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.252675 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.252689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.252708 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.252720 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.355236 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.355285 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.355302 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.355321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.355335 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.458356 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.458409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.458427 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.458449 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.458466 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.560769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.560816 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.560828 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.560850 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.560861 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.664087 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.664151 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.664166 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.664182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.664195 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.711926 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.711962 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.712033 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:13 crc kubenswrapper[4845]: E0202 10:33:13.712110 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:13 crc kubenswrapper[4845]: E0202 10:33:13.712189 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:13 crc kubenswrapper[4845]: E0202 10:33:13.712343 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.740217 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 15:02:56.150140507 +0000 UTC Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.767002 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.767060 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.767081 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.767109 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.767126 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.869988 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.870085 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.870114 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.870140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.870160 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.980983 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.981030 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.981038 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.981051 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:13 crc kubenswrapper[4845]: I0202 10:33:13.981061 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:13Z","lastTransitionTime":"2026-02-02T10:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.084154 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.084214 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.084231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.084254 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.084271 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.189587 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.189657 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.189680 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.189709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.189733 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.292726 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.292779 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.292796 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.292817 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.292834 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.396265 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.396318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.396335 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.396358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.396376 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.498667 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.498781 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.498806 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.498836 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.498858 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.601193 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.601245 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.601260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.601279 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.601291 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.703958 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.704235 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.704341 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.704478 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.704576 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.711691 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:14 crc kubenswrapper[4845]: E0202 10:33:14.711802 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.712803 4845 scope.go:117] "RemoveContainer" containerID="939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.741116 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 06:52:13.181326718 +0000 UTC
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.807535 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.807879 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.808085 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.808262 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.808421 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.912021 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.912081 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.912098 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.912123 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:14 crc kubenswrapper[4845]: I0202 10:33:14.912140 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:14Z","lastTransitionTime":"2026-02-02T10:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.015294 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.015637 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.015951 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.016089 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.016196 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.119852 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.119948 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.119966 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.119990 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.120006 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.219502 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/2.log"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.221576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.221848 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.222100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.222308 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.222455 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.224540 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a"} Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.225288 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.247607 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-0
2T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\
\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.279694 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89
d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.300127 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.325253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.325294 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.325305 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.325319 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.325328 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.329791 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.345505 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.375321 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.395535 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.417146 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.427231 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.427260 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.427271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.427299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.427307 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.428894 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0555b7db-8395-4133-bcbb-cc7b913aa8d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.439998 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.452358 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.466598 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.476426 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc 
kubenswrapper[4845]: I0202 10:33:15.490691 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.500546 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.514194 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:08Z\\\",\\\"message\\\":\\\"2026-02-02T10:32:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574\\\\n2026-02-02T10:32:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574 to /host/opt/cni/bin/\\\\n2026-02-02T10:32:23Z [verbose] multus-daemon started\\\\n2026-02-02T10:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:33:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.529312 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.529349 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.529357 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.529371 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.529380 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.529766 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.540782 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.552393 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.632317 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.632357 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.632371 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.632388 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.632398 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.712017 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:15 crc kubenswrapper[4845]: E0202 10:33:15.712232 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.712615 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:15 crc kubenswrapper[4845]: E0202 10:33:15.712775 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.713253 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:15 crc kubenswrapper[4845]: E0202 10:33:15.713403 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.736001 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.736112 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.736136 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.736168 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.736189 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.742478 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:12:33.186604144 +0000 UTC Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.839214 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.839288 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.839307 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.839807 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.839866 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.943638 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.943728 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.943746 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.943772 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:15 crc kubenswrapper[4845]: I0202 10:33:15.943788 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:15Z","lastTransitionTime":"2026-02-02T10:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.047420 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.047826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.048109 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.048299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.048460 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.151373 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.151429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.151446 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.151467 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.151483 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.230735 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/3.log" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.231783 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/2.log" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.236271 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a" exitCode=1 Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.236335 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.236400 4845 scope.go:117] "RemoveContainer" containerID="939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.237534 4845 scope.go:117] "RemoveContainer" containerID="b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a" Feb 02 10:33:16 crc kubenswrapper[4845]: E0202 10:33:16.237915 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.254301 4845 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.254367 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.254385 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.254409 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.254426 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.273628 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.295943 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89
d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.316581 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.351117 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://939954f04b6266a3da83d0b6bbe32dd51b52c2121163e582010da34e5023fe77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:48Z\\\",\\\"message\\\":\\\"moval\\\\nI0202 10:32:48.630958 6609 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:32:48.630917 6609 handler.go:208] Removed *v1.Node event handler 7\\\\nI0202 10:32:48.631048 6609 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0202 10:32:48.631090 6609 handler.go:190] Sending *v1.NetworkPolicy event 
handler 4 for removal\\\\nI0202 10:32:48.631121 6609 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:32:48.631188 6609 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:32:48.631221 6609 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0202 10:32:48.631268 6609 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:32:48.631400 6609 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631470 6609 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:32:48.631530 6609 factory.go:656] Stopping watch factory\\\\nI0202 10:32:48.631542 6609 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0202 10:32:48.631633 6609 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:48.631565 6609 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:32:48.631714 6609 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:48.631814 6609 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:15Z\\\",\\\"message\\\":\\\":303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771107 7022 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771126 7022 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd in node crc\\\\nI0202 10:33:15.771141 7022 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd after 0 failed attempt(s)\\\\nI0202 10:33:15.771152 7022 default_network_controller.go:776] 
Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771151 7022 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-xdtrh\\\\nI0202 10:33:15.771173 7022 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xdtrh\\\\nI0202 10:33:15.771184 7022 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xdtrh in node crc\\\\nF0202 10:33:15.771184 7022 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped alrea\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.357967 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 
10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.358025 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.358042 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.358082 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.358100 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.371157 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.392793 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.412515 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.435581 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.450918 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc 
kubenswrapper[4845]: I0202 10:33:16.461228 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.461276 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.461293 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.461317 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.461334 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.465736 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9d
f9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.481032 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0555b7db-8395-4133-bcbb-cc7b913aa8d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.498188 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.517757 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.533881 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.554332 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.565139 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.565198 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.565217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.565242 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.565260 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.574746 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.592621 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.615447 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:08Z\\\",\\\"message\\\":\\\"2026-02-02T10:32:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574\\\\n2026-02-02T10:32:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574 to /host/opt/cni/bin/\\\\n2026-02-02T10:32:23Z [verbose] multus-daemon started\\\\n2026-02-02T10:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:33:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.633006 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:16Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.669067 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.669150 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.669171 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc 
kubenswrapper[4845]: I0202 10:33:16.669196 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.669214 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.711638 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:16 crc kubenswrapper[4845]: E0202 10:33:16.711827 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.743447 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 00:28:04.908610689 +0000 UTC Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.772400 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.772447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.772460 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.772479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.772496 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.875116 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.875211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.875240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.875298 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.875323 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.980727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.980802 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.980825 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.980859 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:16 crc kubenswrapper[4845]: I0202 10:33:16.980922 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:16Z","lastTransitionTime":"2026-02-02T10:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.084152 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.084250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.084280 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.084318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.084346 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.186957 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.187002 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.187011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.187026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.187039 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.242660 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/3.log" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.247404 4845 scope.go:117] "RemoveContainer" containerID="b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a" Feb 02 10:33:17 crc kubenswrapper[4845]: E0202 10:33:17.247718 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.261777 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.276020 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0555b7db-8395-4133-bcbb-cc7b913aa8d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.290542 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.290614 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.290635 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.290662 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.290681 4845 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.294028 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.307652 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.322602 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.339261 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc 
kubenswrapper[4845]: I0202 10:33:17.350048 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.360693 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.373257 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:08Z\\\",\\\"message\\\":\\\"2026-02-02T10:32:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574\\\\n2026-02-02T10:32:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574 to /host/opt/cni/bin/\\\\n2026-02-02T10:32:23Z [verbose] multus-daemon started\\\\n2026-02-02T10:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:33:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.382821 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.393958 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.394345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.394389 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.394423 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.394443 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.394455 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.406632 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.427416 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.440825 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.455727 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.483033 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:15Z\\\",\\\"message\\\":\\\":303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771107 7022 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771126 7022 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd in node crc\\\\nI0202 
10:33:15.771141 7022 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd after 0 failed attempt(s)\\\\nI0202 10:33:15.771152 7022 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771151 7022 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-xdtrh\\\\nI0202 10:33:15.771173 7022 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xdtrh\\\\nI0202 10:33:15.771184 7022 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xdtrh in node crc\\\\nF0202 10:33:15.771184 7022 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped alrea\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.497668 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.498040 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.498095 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.498106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.498120 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.498132 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.545197 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.558279 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.601234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.601264 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.601273 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.601286 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.601298 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.703733 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.703769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.703780 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.703795 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.703807 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.711696 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.711835 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:17 crc kubenswrapper[4845]: E0202 10:33:17.711938 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:17 crc kubenswrapper[4845]: E0202 10:33:17.712004 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.711746 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:17 crc kubenswrapper[4845]: E0202 10:33:17.712210 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.743997 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 08:03:15.562262504 +0000 UTC Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.806620 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.806673 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.806687 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.806703 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.806716 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.908716 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.908761 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.908770 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.908787 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.908799 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.913789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.913847 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.913865 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.913925 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.913942 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: E0202 10:33:17.933794 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.938155 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.938189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.938197 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.938211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.938221 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: E0202 10:33:17.955018 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.960398 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.960458 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.960470 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.960485 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.960496 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: E0202 10:33:17.978208 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.981576 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.981609 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.981620 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.981635 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:17 crc kubenswrapper[4845]: I0202 10:33:17.981645 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:17Z","lastTransitionTime":"2026-02-02T10:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:17 crc kubenswrapper[4845]: E0202 10:33:17.992925 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:17Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.001635 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.001691 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.001710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.001734 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.001754 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: E0202 10:33:18.014429 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:18Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:18 crc kubenswrapper[4845]: E0202 10:33:18.014714 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.016708 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.016752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.016769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.016791 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.016806 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.119909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.119958 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.119970 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.119987 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.119998 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.223026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.223056 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.223065 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.223076 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.223085 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.326617 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.326681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.326694 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.326713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.326726 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.429307 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.429400 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.429453 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.429480 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.429498 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.531126 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.531178 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.531189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.531207 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.531218 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.634524 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.634587 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.634603 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.634626 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.634644 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.712237 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:18 crc kubenswrapper[4845]: E0202 10:33:18.712412 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.737505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.737556 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.737578 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.737604 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.737625 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.745020 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 12:17:37.50405896 +0000 UTC Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.839937 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.839973 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.839983 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.840002 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.840012 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.943525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.943580 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.943601 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.943630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:18 crc kubenswrapper[4845]: I0202 10:33:18.943650 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:18Z","lastTransitionTime":"2026-02-02T10:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.046848 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.046954 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.046978 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.047007 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.047029 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.150417 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.150479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.150495 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.150520 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.150536 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.253217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.253272 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.253287 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.253309 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.253325 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.356069 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.356135 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.356144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.356156 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.356165 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.459255 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.459322 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.459339 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.459364 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.459379 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.562348 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.562714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.562882 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.563106 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.563258 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.666724 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.666756 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.666772 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.666789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.666802 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.712086 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.712088 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:19 crc kubenswrapper[4845]: E0202 10:33:19.712341 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:19 crc kubenswrapper[4845]: E0202 10:33:19.712224 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.712108 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:19 crc kubenswrapper[4845]: E0202 10:33:19.712429 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.728196 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.741277 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.745182 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 22:01:28.621247827 +0000 UTC Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.763400 4845 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36
fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:3
2:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.768487 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.768517 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.768528 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.768542 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.768554 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.775532 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc 
kubenswrapper[4845]: I0202 10:33:19.787075 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.802913 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0555b7db-8395-4133-bcbb-cc7b913aa8d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.817092 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:08Z\\\",\\\"message\\\":\\\"2026-02-02T10:32:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574\\\\n2026-02-02T10:32:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574 to /host/opt/cni/bin/\\\\n2026-02-02T10:32:23Z [verbose] multus-daemon started\\\\n2026-02-02T10:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:33:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.831369 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895
c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.847412 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.863484 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.870183 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.870230 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.870240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.870254 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.870263 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.878447 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.889950 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.906360 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.936726 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:15Z\\\",\\\"message\\\":\\\":303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771107 7022 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771126 7022 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd in node crc\\\\nI0202 
10:33:15.771141 7022 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd after 0 failed attempt(s)\\\\nI0202 10:33:15.771152 7022 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771151 7022 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-xdtrh\\\\nI0202 10:33:15.771173 7022 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xdtrh\\\\nI0202 10:33:15.771184 7022 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xdtrh in node crc\\\\nF0202 10:33:15.771184 7022 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped alrea\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.959576 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.972566 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.972612 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.972624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.972640 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.972651 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:19Z","lastTransitionTime":"2026-02-02T10:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.977027 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:19 crc kubenswrapper[4845]: I0202 10:33:19.992434 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:19Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.013038 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.036946 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:20Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.075531 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.075581 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.075608 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.075625 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.075636 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.177368 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.177446 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.177472 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.177502 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.177523 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.280007 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.280078 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.280091 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.280111 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.280124 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.383358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.383433 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.383457 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.383491 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.383514 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.486784 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.486834 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.486850 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.486871 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.486913 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.590033 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.590078 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.590094 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.590117 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.590134 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.693008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.693078 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.693097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.693123 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.693141 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.712359 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:20 crc kubenswrapper[4845]: E0202 10:33:20.712597 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.746120 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 18:03:22.232770698 +0000 UTC Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.795993 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.796061 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.796083 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.796115 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.796137 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.899268 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.899333 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.899351 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.899374 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:20 crc kubenswrapper[4845]: I0202 10:33:20.899391 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:20Z","lastTransitionTime":"2026-02-02T10:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.002823 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.002874 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.002927 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.002951 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.002968 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.106191 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.106244 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.106265 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.106290 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.106312 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.209144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.209217 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.209241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.209269 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.209291 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.312324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.312381 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.312399 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.312421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.312438 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.415453 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.415524 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.415554 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.415587 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.415609 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.519127 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.519173 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.519190 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.519207 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.519217 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.622141 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.622222 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.622240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.622266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.622282 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.712139 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.712225 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:21 crc kubenswrapper[4845]: E0202 10:33:21.712313 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.712380 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:21 crc kubenswrapper[4845]: E0202 10:33:21.712530 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:21 crc kubenswrapper[4845]: E0202 10:33:21.712724 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.725161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.725194 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.725206 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.725219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.725228 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.746560 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 10:39:48.848883117 +0000 UTC Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.829439 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.829489 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.829503 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.829525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.829537 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.932359 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.932400 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.932416 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.932434 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:21 crc kubenswrapper[4845]: I0202 10:33:21.932447 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:21Z","lastTransitionTime":"2026-02-02T10:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.036257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.036326 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.036348 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.036375 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.036396 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.141774 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.141823 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.141837 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.141854 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.141869 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.244709 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.244745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.244754 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.244768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.244778 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.348144 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.348214 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.348240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.348271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.348292 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.451689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.452313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.452333 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.452347 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.452356 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.555956 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.556047 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.556072 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.556099 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.556120 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.659342 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.659428 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.659462 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.659506 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.659548 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.712352 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:22 crc kubenswrapper[4845]: E0202 10:33:22.712541 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.747721 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 23:33:43.553612973 +0000 UTC Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.762140 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.762176 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.762184 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.762197 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.762206 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.865157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.865233 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.865256 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.865285 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.865306 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.968559 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.968652 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.968681 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.968706 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:22 crc kubenswrapper[4845]: I0202 10:33:22.968724 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:22Z","lastTransitionTime":"2026-02-02T10:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.072353 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.072429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.072455 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.072484 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.072505 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.175713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.175777 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.175789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.175808 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.175822 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.278257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.278295 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.278306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.278321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.278331 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.381248 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.381311 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.381327 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.381350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.381367 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.484806 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.484860 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.484877 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.484928 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.484945 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.588339 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.588400 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.588419 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.588442 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.588459 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.691492 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.691558 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.691577 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.691602 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.691620 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.700202 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.700335 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700408 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.70037815 +0000 UTC m=+148.791779630 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700480 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700504 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700523 4845 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700579 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.700562066 +0000 UTC m=+148.791963556 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700604 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700628 4845 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700644 4845 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.700496 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700695 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.70068023 +0000 UTC m=+148.792081720 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.700782 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.700832 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.700945 4845 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.701003 4845 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.701013 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.700995299 +0000 UTC m=+148.792396779 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.701116 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.701090112 +0000 UTC m=+148.792491592 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.712475 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.712515 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.712523 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.712630 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.712759 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:23 crc kubenswrapper[4845]: E0202 10:33:23.713018 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.748506 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 19:38:03.137750613 +0000 UTC
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.794240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.794298 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.794315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.794339 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.794357 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.896126 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.896171 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.896180 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.896192 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.896201 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.998947 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.999011 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.999028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.999053 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:23 crc kubenswrapper[4845]: I0202 10:33:23.999070 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:23Z","lastTransitionTime":"2026-02-02T10:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.102190 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.102250 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.102270 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.102293 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.102310 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.205232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.205310 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.205350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.205382 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.205406 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.308251 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.308304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.308314 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.308332 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.308344 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.410952 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.411032 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.411050 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.411074 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.411092 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.514440 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.514492 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.514504 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.514527 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.514541 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.618356 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.618403 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.618413 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.618429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.618442 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.712438 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:24 crc kubenswrapper[4845]: E0202 10:33:24.712605 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.721332 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.721393 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.721408 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.721429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.721443 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.749103 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 19:25:20.058714981 +0000 UTC
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.823983 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.824044 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.824061 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.824085 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.824104 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.926851 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.926940 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.926959 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.926985 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:24 crc kubenswrapper[4845]: I0202 10:33:24.927004 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:24Z","lastTransitionTime":"2026-02-02T10:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.029658 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.029724 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.029741 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.029764 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.029782 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.132219 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.132256 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.132266 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.132282 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.132295 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.234872 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.235002 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.235021 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.235056 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.235091 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.338297 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.338338 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.338346 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.338359 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.338367 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.441505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.441569 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.441587 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.441611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.441628 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.544271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.544306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.544316 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.544329 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.544337 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.647752 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.647808 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.647827 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.647881 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.647935 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.712397 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:25 crc kubenswrapper[4845]: E0202 10:33:25.712570 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.712837 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:25 crc kubenswrapper[4845]: E0202 10:33:25.712976 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.713344 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:25 crc kubenswrapper[4845]: E0202 10:33:25.713547 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.749281 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 12:20:02.168759933 +0000 UTC
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.750968 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.751026 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.751037 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.751055 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.751067 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.853779 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.853807 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.853817 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.853833 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.853844 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.956268 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.956332 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.956352 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.956380 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:25 crc kubenswrapper[4845]: I0202 10:33:25.956401 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:25Z","lastTransitionTime":"2026-02-02T10:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.058625 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.058676 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.058693 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.058713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.058730 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.160789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.160842 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.160858 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.160880 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.160950 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.263666 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.263727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.263743 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.263768 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.263785 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.366383 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.366711 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.366805 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.366916 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.367002 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.470258 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.470617 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.470714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.470799 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.470913 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.574189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.574226 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.574234 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.574248 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.574257 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.678412 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.678499 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.678534 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.678582 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.678607 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.711837 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:26 crc kubenswrapper[4845]: E0202 10:33:26.712121 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.749720 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 08:07:19.283324304 +0000 UTC Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.781452 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.781650 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.781673 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.781730 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.781752 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.884731 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.884804 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.884821 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.884850 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.884870 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.989369 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.989448 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.989481 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.989513 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:26 crc kubenswrapper[4845]: I0202 10:33:26.989537 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:26Z","lastTransitionTime":"2026-02-02T10:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.092789 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.092846 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.092868 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.092924 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.092948 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.195524 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.195604 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.195633 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.195667 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.195687 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.298208 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.298265 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.298282 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.298304 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.298321 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.405492 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.405561 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.405590 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.405621 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.405646 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.508421 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.508488 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.508528 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.508559 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.508585 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.612088 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.612129 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.612138 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.612151 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.612162 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.714097 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.714297 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.714387 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:27 crc kubenswrapper[4845]: E0202 10:33:27.714622 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:27 crc kubenswrapper[4845]: E0202 10:33:27.714827 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:27 crc kubenswrapper[4845]: E0202 10:33:27.714972 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.716632 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.716684 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.716713 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.716738 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.716755 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.750532 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 19:26:55.169598552 +0000 UTC Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.819935 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.820036 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.820054 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.820076 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.820092 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.971934 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.972064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.972085 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.972113 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:27 crc kubenswrapper[4845]: I0202 10:33:27.972133 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:27Z","lastTransitionTime":"2026-02-02T10:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.074551 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.074590 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.074601 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.074615 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.074628 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.178157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.178255 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.178274 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.178691 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.178963 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.280668 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.280721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.280743 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.280769 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.280786 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: E0202 10:33:28.302645 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.307994 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.308053 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.308069 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.308093 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.308110 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: E0202 10:33:28.326981 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.331989 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.332048 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.332068 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.332092 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.332109 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: E0202 10:33:28.352774 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.358179 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.358253 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.358271 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.358297 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.358314 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: E0202 10:33:28.379465 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.384727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.384788 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.384806 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.384830 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.384848 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: E0202 10:33:28.405278 4845 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"8f18ce78-9cc3-4dbd-9d49-5987790a156d\\\",\\\"systemUUID\\\":\\\"a0f7ad40-dfc7-4c48-b08f-9dc9799ca728\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:28Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:28 crc kubenswrapper[4845]: E0202 10:33:28.405610 4845 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.407699 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.407761 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.407785 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.407813 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.407838 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.510990 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.511045 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.511062 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.511084 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.511101 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.614319 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.614369 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.614390 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.614417 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.614435 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.711635 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:28 crc kubenswrapper[4845]: E0202 10:33:28.711823 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.717683 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.717744 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.717763 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.717786 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.717804 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.751133 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 06:21:55.246155615 +0000 UTC Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.820833 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.820878 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.820909 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.820927 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.820941 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.923664 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.923745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.923766 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.923790 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:28 crc kubenswrapper[4845]: I0202 10:33:28.923823 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:28Z","lastTransitionTime":"2026-02-02T10:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.026662 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.026715 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.026733 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.026755 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.026772 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.130262 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.130321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.130339 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.130365 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.130383 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.233415 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.233486 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.233497 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.233516 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.233527 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.337001 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.337064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.337081 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.337105 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.337123 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.439547 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.439605 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.439618 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.439638 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.439652 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.542761 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.542826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.542843 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.542867 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.542919 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.645610 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.645660 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.645672 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.645689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.645706 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.712403 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.712414 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:29 crc kubenswrapper[4845]: E0202 10:33:29.712637 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.712681 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:29 crc kubenswrapper[4845]: E0202 10:33:29.712811 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:29 crc kubenswrapper[4845]: E0202 10:33:29.712963 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.733550 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.750335 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.750398 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.750414 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.750437 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.750455 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.751475 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 15:35:15.860076465 +0000 UTC Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.754774 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.779724 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-thbz4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01f1334f-a21c-4487-a1d6-dbecf7017c59\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1c279629fb253d48dc16d3135455acf6898e13d9c6c1abdbbba6745bf444271\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86e833c4ffa4adc1ef1e3436117fc1c7bfc71ccdaa6a33138f70316fa4aa148c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94f90971c998191dab0c4c9bd5c32505ad7d59936f1278bbeadca6d396515e3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f687cea8c04a18db685e70c1f08a6a39c133fd1457894abbc30378dc55451e42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57e84
4982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://57e844982b527eb7d6e332e02b3103b17356b972ae995785c2b77c4da9d4cfe0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d80fba12e96c4d6bf74ce0f6eaa569bcc36fbe1f37ab194a1349a00647373c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a0c97b12b3f63a2cf8e758064a8f223db848e8e59b7ae091a1c7f6c6cfaf233\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k9h9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-thbz4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.797571 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v5d9c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pmn9h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc 
kubenswrapper[4845]: I0202 10:33:29.818417 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fc2b1ae-676c-4b63-98d0-074ca1549855\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73de557b3db542f15304bf4040cc3d6cb977653435b5554a5997017bddd99f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f6a90efd39510f9e697dda93f9d9df9d66fc4939524d5f63dac2de69eb631be\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cbb077b7b9ee7a19d612c668b38abae9c19bc0b77fb1d0e6ead7456b9e4f1778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc810e75835737db586cd41987af98c1b2aaf409fc667b10b0c793f925e1ce20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.830638 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0555b7db-8395-4133-bcbb-cc7b913aa8d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77147984edf194643baed82a9c8ff33e195c8f0fd949be89036668931346dc84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://970e304480db2ba449cc9b741e5e43e49e2b4644b1d7361f680de0487fb489cb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.845252 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-kzwst" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:33:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:08Z\\\",\\\"message\\\":\\\"2026-02-02T10:32:23+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574\\\\n2026-02-02T10:32:23+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b74a23ed-1be3-4ccf-8e01-88593520f574 to /host/opt/cni/bin/\\\\n2026-02-02T10:32:23Z [verbose] multus-daemon started\\\\n2026-02-02T10:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:33:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4ptl6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-kzwst\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.852584 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.852624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.852634 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.852650 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.852662 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.856540 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ebf2f253-531f-4835-84c1-928680352f7f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8ba67ef020faaba1b0de9261d9c81a2fb5f7150fd656146120be915d505b9690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc8c6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2wnn9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.869269 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xdtrh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86c7aea7-01dc-4f6d-ab41-94447a76fd6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://520376a20ac322cf8b06027f9eaf3ac45b420bc59dc146e0a0f7d607e4e8b4af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f24s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xdtrh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.885503 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"67f40dda-fb6a-490a-86a1-a14d6b183c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://286c1a22bddd9b49ba922e96d6189249be4b138ad1cbddaa0900380c3ac74fc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b76ba08a019ae3c5baf8eb38326be46e924ae188878727c70a3cdd922978df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zvdnv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-c2rmk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.902478 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d70456ccb7609e2b329f3354b6bf22567cf006def4b2d6d3c977a5557610255b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.914855 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rzb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b94e8620-d850-4036-b311-42b2a6369c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56fd0a9441c1ef3ad3648d904b687c0047546fced1f41cdbd684e9a459274ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5szs4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rzb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.928912 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51975473c1a580da686a24af479ba6bdf021aa18374e7a5a5a25697f0e7c1c89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b4c91dd525a59761c418dc5d0da4726c8fb020132487234982f56b7b1baa997\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.949864 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:33:15Z\\\",\\\"message\\\":\\\":303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771107 7022 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771126 7022 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd in node crc\\\\nI0202 
10:33:15.771141 7022 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-sh5vd after 0 failed attempt(s)\\\\nI0202 10:33:15.771152 7022 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-sh5vd\\\\nI0202 10:33:15.771151 7022 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-xdtrh\\\\nI0202 10:33:15.771173 7022 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-xdtrh\\\\nI0202 10:33:15.771184 7022 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-xdtrh in node crc\\\\nF0202 10:33:15.771184 7022 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped alrea\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9066826bc923668c
4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wdgfw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:32:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sh5vd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.955942 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.956009 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.956029 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.956057 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.956076 4845 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:29Z","lastTransitionTime":"2026-02-02T10:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.974842 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7734a376-782c-48c5-a900-6848db158367\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cbaa3d016ffb0dcfac62822079d7d6107d9c83d148fced65d10a499ecc59a209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e16c171df9a3b92833d4b9ce6a266e61e1e395be8718a8f0ad17d91e2f81bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e233a2094abb67e4681a879c9833ebfab622f3d474c626eedb84c64bfd472267\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4d7c63b5e4c1ff659c55062d96136f9e1da6aa5b387b9a5c827a214a08d20b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffd643db8d775b30f6af31b66cbf14a3c69d5f905b259f8cc7278cdfdbb47fcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b28b6ef2595e94d53201f95e0ee15ca846f46f180304e060e81c80b286594676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c82afeaee19739204c15badd7f3c696894d6a55ad3864e5fe8f0ffe8513abe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1292bccaf8d5a2245d414fceb61f4d986778c4609b3809b6242afe6eae000db3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:02Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:29 crc kubenswrapper[4845]: I0202 10:33:29.994820 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:32:20Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0202 10:32:19.778605 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:32:19.778698 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:32:19.779308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1786026333/tls.crt::/tmp/serving-cert-1786026333/tls.key\\\\\\\"\\\\nI0202 10:32:20.300305 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:32:20.306387 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:32:20.306405 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:32:20.306428 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:32:20.306438 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:32:20.316852 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:32:20.316876 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:32:20.316899 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:32:20.316904 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:32:20.316907 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:32:20.316910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:32:20.317037 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0202 10:32:20.318666 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:02Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://07753d42b1f45141fa85e04e5ef07cd89
d67c405b95a52b8f64aa0fa3785a850\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:32:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:29Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.013704 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.032971 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60ebd4f2-07f1-4feb-891d-90a28f6bd42b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e142c14a88e83ff651b3bc885d4a686a80009bcb2f8e462676f985525a22a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://555c98f1dd5a685f181dc7a501212953659b1f17682e20c8240891b78572f8ce\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c63b9e73c0108a15bfbdab8ad846fecd1881b4cbe8efe70d4b0590c77d028e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:32:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.047511 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dcfcbbb0e3f0e6e8d42cc155396b11719e721ede6a85209e69ac2eede1ac353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:33:30Z is after 2025-08-24T17:21:41Z" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.059224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.059292 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.059313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.059344 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.059373 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.162652 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.162716 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.162727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.162745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.162759 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.265444 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.265505 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.265524 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.265548 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.265570 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.369125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.369197 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.369243 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.369278 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.369305 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.473479 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.473541 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.473567 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.473598 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.473621 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.576663 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.576715 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.576727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.576745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.576758 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.678699 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.678739 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.678748 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.678762 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.678770 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.712251 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:30 crc kubenswrapper[4845]: E0202 10:33:30.712405 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.752638 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 12:05:58.522633834 +0000 UTC Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.782306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.782372 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.782644 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.782670 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.782688 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.885917 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.885978 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.885996 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.886021 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.886039 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.988669 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.989102 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.989201 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.989293 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:30 crc kubenswrapper[4845]: I0202 10:33:30.989372 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:30Z","lastTransitionTime":"2026-02-02T10:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.092643 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.093028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.093041 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.093097 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.093112 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.195627 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.195693 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.195710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.195736 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.195753 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.298157 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.298508 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.298600 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.298689 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.298775 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.401718 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.401771 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.401796 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.401824 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.401849 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.504287 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.504336 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.504353 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.504377 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.504399 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.607622 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.607692 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.607714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.607737 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.607753 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.710258 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.710324 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.710350 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.710380 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.710402 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.712548 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.712561 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:31 crc kubenswrapper[4845]: E0202 10:33:31.712793 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.713194 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:31 crc kubenswrapper[4845]: E0202 10:33:31.713251 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:31 crc kubenswrapper[4845]: E0202 10:33:31.713406 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.714330 4845 scope.go:117] "RemoveContainer" containerID="b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a" Feb 02 10:33:31 crc kubenswrapper[4845]: E0202 10:33:31.714569 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.753854 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 06:18:26.305947938 +0000 UTC Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.813524 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.813560 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.813571 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.813586 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.813599 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.917575 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.917630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.917647 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.917671 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:31 crc kubenswrapper[4845]: I0202 10:33:31.917687 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:31Z","lastTransitionTime":"2026-02-02T10:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.020453 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.020515 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.020532 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.020554 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.020572 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.125300 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.125360 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.125378 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.125402 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.125419 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.229085 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.229135 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.229147 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.229164 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.229176 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.331721 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.331766 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.331800 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.331820 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.331831 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.435282 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.435347 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.435366 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.435392 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.435409 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.538838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.539317 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.539471 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.539648 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.539784 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.642130 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.642182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.642194 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.642213 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.642226 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.712101 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:32 crc kubenswrapper[4845]: E0202 10:33:32.712286 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.744517 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.744584 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.744601 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.744624 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.744785 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.754946 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 00:24:28.201500379 +0000 UTC Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.847590 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.847660 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.847685 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.847753 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.847774 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.951067 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.951118 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.951136 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.951159 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:32 crc kubenswrapper[4845]: I0202 10:33:32.951175 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:32Z","lastTransitionTime":"2026-02-02T10:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.054539 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.054584 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.054596 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.054611 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.054620 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.158337 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.158406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.158425 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.158451 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.158470 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.262275 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.262311 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.262321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.262336 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.262345 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.365447 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.365508 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.365525 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.365550 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.365567 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.468714 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.468788 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.468812 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.468834 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.468852 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.572213 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.572285 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.572296 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.572312 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.572322 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.675006 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.675084 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.675104 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.675130 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.675149 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.712494 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.712537 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:33 crc kubenswrapper[4845]: E0202 10:33:33.712698 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.712793 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:33 crc kubenswrapper[4845]: E0202 10:33:33.713020 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:33 crc kubenswrapper[4845]: E0202 10:33:33.713106 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.755127 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 07:44:21.240965445 +0000 UTC Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.778211 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.778279 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.778301 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.778325 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.778342 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.881313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.881387 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.881406 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.881430 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.881446 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.984981 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.985034 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.985051 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.985075 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:33 crc kubenswrapper[4845]: I0202 10:33:33.985092 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:33Z","lastTransitionTime":"2026-02-02T10:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.088182 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.088261 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.088299 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.088330 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.088358 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.191540 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.191608 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.191625 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.191649 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.191667 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.294353 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.294432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.294449 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.294473 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.294504 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.397578 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.397618 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.397630 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.397646 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.397657 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.501142 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.501196 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.501209 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.501229 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.501242 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.603954 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.604099 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.604120 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.604192 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.604221 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.707244 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.707672 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.707838 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.708111 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.708296 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.712596 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:34 crc kubenswrapper[4845]: E0202 10:33:34.712976 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.755529 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 03:30:20.510948105 +0000 UTC Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.811168 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.811223 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.811241 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.811264 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.811282 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.914200 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.914306 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.914321 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.914337 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:34 crc kubenswrapper[4845]: I0202 10:33:34.914350 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:34Z","lastTransitionTime":"2026-02-02T10:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.017827 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.018343 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.018438 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.018526 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.018599 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.140008 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.140096 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.140118 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.140145 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.140164 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.244161 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.244205 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.244279 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.244307 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.244324 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.347141 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.347195 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.347209 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.347232 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.347250 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.449930 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.450004 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.450028 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.450057 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.450079 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.552614 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.552731 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.552758 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.552787 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.552806 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.655307 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.655374 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.655392 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.655418 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.655435 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.712336 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.712354 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:35 crc kubenswrapper[4845]: E0202 10:33:35.713036 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.712424 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:35 crc kubenswrapper[4845]: E0202 10:33:35.713208 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:35 crc kubenswrapper[4845]: E0202 10:33:35.712849 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.755668 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 06:50:06.897482429 +0000 UTC Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.758183 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.758345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.758366 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.758389 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.758425 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.861840 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.861920 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.861938 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.861961 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.861978 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.965343 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.965432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.965464 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.965544 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:35 crc kubenswrapper[4845]: I0202 10:33:35.965605 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:35Z","lastTransitionTime":"2026-02-02T10:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.069164 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.069224 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.069240 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.069264 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.069283 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.172641 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.172710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.172727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.172750 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.172773 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.276057 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.276114 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.276134 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.276156 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.276175 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.379288 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.379365 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.379384 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.379411 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.379429 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.483249 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.483325 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.483348 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.483381 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.483400 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.587064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.587159 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.587180 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.587212 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.587234 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.690064 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.690100 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.690111 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.690125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.690137 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.712499 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:36 crc kubenswrapper[4845]: E0202 10:33:36.712669 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.756300 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 16:59:56.883911131 +0000 UTC Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.792530 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.792575 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.792592 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.792616 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.792633 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.895716 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.895777 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.895798 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.895826 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.895849 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.998257 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.998302 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.998313 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.998331 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:36 crc kubenswrapper[4845]: I0202 10:33:36.998343 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:36Z","lastTransitionTime":"2026-02-02T10:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.101727 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.101764 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.101782 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.101802 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.101813 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.204745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.204809 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.204829 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.204857 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.204874 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.307017 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.307081 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.307099 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.307129 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.307146 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.410345 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.410382 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.410392 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.410407 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.410418 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.513318 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.513758 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.513912 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.514055 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.514182 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.616747 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.617043 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.617137 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.617212 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.617295 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.712631 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:37 crc kubenswrapper[4845]: E0202 10:33:37.713052 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.712683 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:37 crc kubenswrapper[4845]: E0202 10:33:37.713312 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.712632 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:37 crc kubenswrapper[4845]: E0202 10:33:37.713735 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.720936 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.721003 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.721022 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.721049 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.721068 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.756820 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:37:36.887498072 +0000 UTC Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.824390 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.824429 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.824437 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.824470 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.824480 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.927412 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.927480 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.927506 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.927535 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:37 crc kubenswrapper[4845]: I0202 10:33:37.927556 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:37Z","lastTransitionTime":"2026-02-02T10:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.031588 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.031719 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.031745 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.031774 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.031797 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:38Z","lastTransitionTime":"2026-02-02T10:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.135072 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.135497 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.135731 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.135949 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.136274 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:38Z","lastTransitionTime":"2026-02-02T10:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.239125 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.239189 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.239206 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.239230 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.239250 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:38Z","lastTransitionTime":"2026-02-02T10:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.342277 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.342373 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.342432 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.342454 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.342510 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:38Z","lastTransitionTime":"2026-02-02T10:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.445337 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.445710 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.445867 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.446121 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.446263 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:38Z","lastTransitionTime":"2026-02-02T10:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.523246 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.523315 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.523332 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.523358 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.523378 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:38Z","lastTransitionTime":"2026-02-02T10:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.564307 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.564382 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.564399 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.564427 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.564445 4845 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:38Z","lastTransitionTime":"2026-02-02T10:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.593684 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw"] Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.594415 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.598475 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.598604 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.598693 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.598779 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.618564 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=45.618480886 podStartE2EDuration="45.618480886s" podCreationTimestamp="2026-02-02 10:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.617134045 +0000 UTC m=+99.708535495" watchObservedRunningTime="2026-02-02 10:33:38.618480886 +0000 UTC m=+99.709882386" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.651649 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=31.651627263 podStartE2EDuration="31.651627263s" podCreationTimestamp="2026-02-02 10:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.63343465 +0000 UTC m=+99.724836100" watchObservedRunningTime="2026-02-02 10:33:38.651627263 +0000 
UTC m=+99.743028713" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.683496 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.684051 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/948fb07c-2db9-45aa-805b-5ba192aae967-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.684220 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/948fb07c-2db9-45aa-805b-5ba192aae967-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.684413 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/948fb07c-2db9-45aa-805b-5ba192aae967-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.684734 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/948fb07c-2db9-45aa-805b-5ba192aae967-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.684944 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/948fb07c-2db9-45aa-805b-5ba192aae967-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: E0202 10:33:38.684003 4845 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:33:38 crc kubenswrapper[4845]: E0202 10:33:38.685329 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs podName:84cb7b66-62e7-4012-ab80-7c5e6ba51e35 nodeName:}" failed. No retries permitted until 2026-02-02 10:34:42.685302176 +0000 UTC m=+163.776703656 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs") pod "network-metrics-daemon-pmn9h" (UID: "84cb7b66-62e7-4012-ab80-7c5e6ba51e35") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.711393 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-thbz4" podStartSLOduration=78.711366909 podStartE2EDuration="1m18.711366909s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.695033162 +0000 UTC m=+99.786434652" watchObservedRunningTime="2026-02-02 10:33:38.711366909 +0000 UTC m=+99.802768369" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.712015 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:38 crc kubenswrapper[4845]: E0202 10:33:38.712148 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.734102 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rzb6b" podStartSLOduration=79.734083599 podStartE2EDuration="1m19.734083599s" podCreationTimestamp="2026-02-02 10:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.733908564 +0000 UTC m=+99.825310034" watchObservedRunningTime="2026-02-02 10:33:38.734083599 +0000 UTC m=+99.825485049" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.757085 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:12:22.327232456 +0000 UTC Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.757154 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.766949 4845 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.770775 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kzwst" podStartSLOduration=78.770756514 podStartE2EDuration="1m18.770756514s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.75518147 +0000 UTC m=+99.846582930" watchObservedRunningTime="2026-02-02 10:33:38.770756514 +0000 UTC m=+99.862157964" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.771047 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podStartSLOduration=78.771042372 podStartE2EDuration="1m18.771042372s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.770527367 +0000 UTC m=+99.861928827" watchObservedRunningTime="2026-02-02 10:33:38.771042372 +0000 UTC m=+99.862443822" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.785494 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xdtrh" podStartSLOduration=78.7854619 podStartE2EDuration="1m18.7854619s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.785096659 +0000 UTC m=+99.876498119" watchObservedRunningTime="2026-02-02 10:33:38.7854619 +0000 UTC m=+99.876863390" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.785602 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/948fb07c-2db9-45aa-805b-5ba192aae967-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.785638 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/948fb07c-2db9-45aa-805b-5ba192aae967-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.785694 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/948fb07c-2db9-45aa-805b-5ba192aae967-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.785730 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/948fb07c-2db9-45aa-805b-5ba192aae967-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.785753 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/948fb07c-2db9-45aa-805b-5ba192aae967-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.786248 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/948fb07c-2db9-45aa-805b-5ba192aae967-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.786705 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/948fb07c-2db9-45aa-805b-5ba192aae967-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.786756 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/948fb07c-2db9-45aa-805b-5ba192aae967-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.801841 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/948fb07c-2db9-45aa-805b-5ba192aae967-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.809661 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/948fb07c-2db9-45aa-805b-5ba192aae967-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gwtsw\" (UID: \"948fb07c-2db9-45aa-805b-5ba192aae967\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.838004 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-c2rmk" podStartSLOduration=77.837977366 podStartE2EDuration="1m17.837977366s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.803956792 +0000 UTC m=+99.895358262" watchObservedRunningTime="2026-02-02 10:33:38.837977366 +0000 UTC m=+99.929378846" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 
10:33:38.838291 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.838283986 podStartE2EDuration="1m15.838283986s" podCreationTimestamp="2026-02-02 10:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.837290806 +0000 UTC m=+99.928692266" watchObservedRunningTime="2026-02-02 10:33:38.838283986 +0000 UTC m=+99.929685466" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.875747 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.875727704 podStartE2EDuration="1m18.875727704s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.875098005 +0000 UTC m=+99.966499485" watchObservedRunningTime="2026-02-02 10:33:38.875727704 +0000 UTC m=+99.967129164" Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.913378 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" Feb 02 10:33:38 crc kubenswrapper[4845]: W0202 10:33:38.928741 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod948fb07c_2db9_45aa_805b_5ba192aae967.slice/crio-43fd53f2bc5b1f32ce78c098d90a6eda1133a88b8b776b2c364e668f6421d332 WatchSource:0}: Error finding container 43fd53f2bc5b1f32ce78c098d90a6eda1133a88b8b776b2c364e668f6421d332: Status 404 returned error can't find the container with id 43fd53f2bc5b1f32ce78c098d90a6eda1133a88b8b776b2c364e668f6421d332 Feb 02 10:33:38 crc kubenswrapper[4845]: I0202 10:33:38.972789 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=72.972770583 podStartE2EDuration="1m12.972770583s" podCreationTimestamp="2026-02-02 10:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.954414335 +0000 UTC m=+100.045815795" watchObservedRunningTime="2026-02-02 10:33:38.972770583 +0000 UTC m=+100.064172043" Feb 02 10:33:39 crc kubenswrapper[4845]: I0202 10:33:39.327116 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" event={"ID":"948fb07c-2db9-45aa-805b-5ba192aae967","Type":"ContainerStarted","Data":"b85737c6a0c98af12e2cb2b520e560c55c207c7ccfaf7439fb5c84df84ee0147"} Feb 02 10:33:39 crc kubenswrapper[4845]: I0202 10:33:39.327172 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" event={"ID":"948fb07c-2db9-45aa-805b-5ba192aae967","Type":"ContainerStarted","Data":"43fd53f2bc5b1f32ce78c098d90a6eda1133a88b8b776b2c364e668f6421d332"} Feb 02 10:33:39 crc kubenswrapper[4845]: I0202 10:33:39.712464 4845 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:39 crc kubenswrapper[4845]: I0202 10:33:39.712476 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:39 crc kubenswrapper[4845]: I0202 10:33:39.712988 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:39 crc kubenswrapper[4845]: E0202 10:33:39.713372 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:39 crc kubenswrapper[4845]: E0202 10:33:39.713443 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:39 crc kubenswrapper[4845]: E0202 10:33:39.713520 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:40 crc kubenswrapper[4845]: I0202 10:33:40.711766 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:40 crc kubenswrapper[4845]: E0202 10:33:40.713678 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:41 crc kubenswrapper[4845]: I0202 10:33:41.712243 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:41 crc kubenswrapper[4845]: I0202 10:33:41.712264 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:41 crc kubenswrapper[4845]: E0202 10:33:41.712937 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:41 crc kubenswrapper[4845]: E0202 10:33:41.713129 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:41 crc kubenswrapper[4845]: I0202 10:33:41.712334 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:41 crc kubenswrapper[4845]: E0202 10:33:41.713459 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:42 crc kubenswrapper[4845]: I0202 10:33:42.712652 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:42 crc kubenswrapper[4845]: E0202 10:33:42.712871 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:42 crc kubenswrapper[4845]: I0202 10:33:42.714159 4845 scope.go:117] "RemoveContainer" containerID="b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a" Feb 02 10:33:42 crc kubenswrapper[4845]: E0202 10:33:42.714431 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sh5vd_openshift-ovn-kubernetes(7b93b041-3f3f-47ba-a9d4-d09de1b326dc)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" Feb 02 10:33:43 crc kubenswrapper[4845]: I0202 10:33:43.712379 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:43 crc kubenswrapper[4845]: I0202 10:33:43.712454 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:43 crc kubenswrapper[4845]: I0202 10:33:43.712405 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:43 crc kubenswrapper[4845]: E0202 10:33:43.712613 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:43 crc kubenswrapper[4845]: E0202 10:33:43.712744 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:43 crc kubenswrapper[4845]: E0202 10:33:43.712870 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:44 crc kubenswrapper[4845]: I0202 10:33:44.711986 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:33:44 crc kubenswrapper[4845]: E0202 10:33:44.712214 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35" Feb 02 10:33:45 crc kubenswrapper[4845]: I0202 10:33:45.713363 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:45 crc kubenswrapper[4845]: I0202 10:33:45.713445 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:45 crc kubenswrapper[4845]: I0202 10:33:45.713470 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:45 crc kubenswrapper[4845]: E0202 10:33:45.713601 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:45 crc kubenswrapper[4845]: E0202 10:33:45.713717 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:45 crc kubenswrapper[4845]: E0202 10:33:45.714292 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:46 crc kubenswrapper[4845]: I0202 10:33:46.711728 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:46 crc kubenswrapper[4845]: E0202 10:33:46.712169 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:47 crc kubenswrapper[4845]: I0202 10:33:47.712641 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:47 crc kubenswrapper[4845]: I0202 10:33:47.712713 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:47 crc kubenswrapper[4845]: E0202 10:33:47.712874 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:47 crc kubenswrapper[4845]: I0202 10:33:47.712936 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:47 crc kubenswrapper[4845]: E0202 10:33:47.713024 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:47 crc kubenswrapper[4845]: E0202 10:33:47.713126 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:48 crc kubenswrapper[4845]: I0202 10:33:48.712494 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:48 crc kubenswrapper[4845]: E0202 10:33:48.713020 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:49 crc kubenswrapper[4845]: I0202 10:33:49.711837 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:49 crc kubenswrapper[4845]: I0202 10:33:49.711980 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:49 crc kubenswrapper[4845]: I0202 10:33:49.714759 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:49 crc kubenswrapper[4845]: E0202 10:33:49.714739 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:49 crc kubenswrapper[4845]: E0202 10:33:49.714986 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:49 crc kubenswrapper[4845]: E0202 10:33:49.715098 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:50 crc kubenswrapper[4845]: I0202 10:33:50.712645 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:50 crc kubenswrapper[4845]: E0202 10:33:50.712854 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:51 crc kubenswrapper[4845]: I0202 10:33:51.712050 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:51 crc kubenswrapper[4845]: I0202 10:33:51.712078 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:51 crc kubenswrapper[4845]: E0202 10:33:51.712246 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:51 crc kubenswrapper[4845]: I0202 10:33:51.712325 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:51 crc kubenswrapper[4845]: E0202 10:33:51.712447 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:51 crc kubenswrapper[4845]: E0202 10:33:51.712525 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:52 crc kubenswrapper[4845]: I0202 10:33:52.712440 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:52 crc kubenswrapper[4845]: E0202 10:33:52.712620 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:53 crc kubenswrapper[4845]: I0202 10:33:53.712272 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:53 crc kubenswrapper[4845]: E0202 10:33:53.712630 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:53 crc kubenswrapper[4845]: I0202 10:33:53.712663 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:53 crc kubenswrapper[4845]: E0202 10:33:53.712803 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:53 crc kubenswrapper[4845]: I0202 10:33:53.712850 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:53 crc kubenswrapper[4845]: E0202 10:33:53.712955 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:54 crc kubenswrapper[4845]: I0202 10:33:54.712203 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:54 crc kubenswrapper[4845]: E0202 10:33:54.712359 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.381602 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/1.log"
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.382341 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/0.log"
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.382396 4845 generic.go:334] "Generic (PLEG): container finished" podID="310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3" containerID="a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f" exitCode=1
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.382435 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kzwst" event={"ID":"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3","Type":"ContainerDied","Data":"a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f"}
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.382484 4845 scope.go:117] "RemoveContainer" containerID="2b7e296528c0b05e1fec124320c451f0dbd9fb5fb76cb9e349019fd2d982de3a"
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.383059 4845 scope.go:117] "RemoveContainer" containerID="a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f"
Feb 02 10:33:55 crc kubenswrapper[4845]: E0202 10:33:55.385202 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-kzwst_openshift-multus(310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3)\"" pod="openshift-multus/multus-kzwst" podUID="310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3"
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.412013 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gwtsw" podStartSLOduration=95.411994606 podStartE2EDuration="1m35.411994606s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.348617554 +0000 UTC m=+100.440019044" watchObservedRunningTime="2026-02-02 10:33:55.411994606 +0000 UTC m=+116.503396056"
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.711956 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.711994 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:55 crc kubenswrapper[4845]: E0202 10:33:55.712084 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:55 crc kubenswrapper[4845]: E0202 10:33:55.712243 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:55 crc kubenswrapper[4845]: I0202 10:33:55.712788 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:55 crc kubenswrapper[4845]: E0202 10:33:55.713244 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:56 crc kubenswrapper[4845]: I0202 10:33:56.388643 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/1.log"
Feb 02 10:33:56 crc kubenswrapper[4845]: I0202 10:33:56.711852 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:56 crc kubenswrapper[4845]: E0202 10:33:56.712082 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:56 crc kubenswrapper[4845]: I0202 10:33:56.713262 4845 scope.go:117] "RemoveContainer" containerID="b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a"
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.393907 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/3.log"
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.396666 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerStarted","Data":"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"}
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.397276 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd"
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.430822 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podStartSLOduration=97.430799539 podStartE2EDuration="1m37.430799539s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:57.430629104 +0000 UTC m=+118.522030594" watchObservedRunningTime="2026-02-02 10:33:57.430799539 +0000 UTC m=+118.522200999"
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.531792 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pmn9h"]
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.531919 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:57 crc kubenswrapper[4845]: E0202 10:33:57.532006 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.711675 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.711734 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:57 crc kubenswrapper[4845]: I0202 10:33:57.711795 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:57 crc kubenswrapper[4845]: E0202 10:33:57.712005 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:57 crc kubenswrapper[4845]: E0202 10:33:57.712123 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:57 crc kubenswrapper[4845]: E0202 10:33:57.712270 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:59 crc kubenswrapper[4845]: E0202 10:33:59.701405 4845 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 02 10:33:59 crc kubenswrapper[4845]: I0202 10:33:59.711835 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:33:59 crc kubenswrapper[4845]: E0202 10:33:59.713127 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:33:59 crc kubenswrapper[4845]: I0202 10:33:59.713205 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:33:59 crc kubenswrapper[4845]: I0202 10:33:59.713205 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:33:59 crc kubenswrapper[4845]: E0202 10:33:59.713303 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:33:59 crc kubenswrapper[4845]: I0202 10:33:59.713222 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:33:59 crc kubenswrapper[4845]: E0202 10:33:59.713410 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:33:59 crc kubenswrapper[4845]: E0202 10:33:59.713452 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:33:59 crc kubenswrapper[4845]: E0202 10:33:59.843972 4845 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:34:01 crc kubenswrapper[4845]: I0202 10:34:01.711956 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:34:01 crc kubenswrapper[4845]: I0202 10:34:01.711992 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:34:01 crc kubenswrapper[4845]: I0202 10:34:01.712052 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:34:01 crc kubenswrapper[4845]: E0202 10:34:01.712125 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:34:01 crc kubenswrapper[4845]: I0202 10:34:01.712227 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:34:01 crc kubenswrapper[4845]: E0202 10:34:01.712482 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:34:01 crc kubenswrapper[4845]: E0202 10:34:01.712602 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:34:01 crc kubenswrapper[4845]: E0202 10:34:01.712695 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:34:03 crc kubenswrapper[4845]: I0202 10:34:03.712473 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:34:03 crc kubenswrapper[4845]: I0202 10:34:03.712552 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:34:03 crc kubenswrapper[4845]: E0202 10:34:03.712673 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:34:03 crc kubenswrapper[4845]: I0202 10:34:03.712752 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:34:03 crc kubenswrapper[4845]: E0202 10:34:03.712925 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:34:03 crc kubenswrapper[4845]: E0202 10:34:03.713057 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:34:03 crc kubenswrapper[4845]: I0202 10:34:03.713246 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:34:03 crc kubenswrapper[4845]: E0202 10:34:03.713362 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:34:04 crc kubenswrapper[4845]: E0202 10:34:04.845077 4845 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:34:05 crc kubenswrapper[4845]: I0202 10:34:05.712202 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:34:05 crc kubenswrapper[4845]: E0202 10:34:05.712616 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:34:05 crc kubenswrapper[4845]: I0202 10:34:05.712693 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:34:05 crc kubenswrapper[4845]: I0202 10:34:05.712673 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:34:05 crc kubenswrapper[4845]: I0202 10:34:05.712715 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:34:05 crc kubenswrapper[4845]: E0202 10:34:05.713113 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:34:05 crc kubenswrapper[4845]: E0202 10:34:05.713370 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:34:05 crc kubenswrapper[4845]: E0202 10:34:05.713526 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:34:06 crc kubenswrapper[4845]: I0202 10:34:06.712366 4845 scope.go:117] "RemoveContainer" containerID="a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f"
Feb 02 10:34:07 crc kubenswrapper[4845]: I0202 10:34:07.434052 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/1.log"
Feb 02 10:34:07 crc kubenswrapper[4845]: I0202 10:34:07.434512 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kzwst" event={"ID":"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3","Type":"ContainerStarted","Data":"276b266e719606f3b154e5d01310e24320f7c03059107da4aad492d36a95867b"}
Feb 02 10:34:07 crc kubenswrapper[4845]: I0202 10:34:07.711613 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:34:07 crc kubenswrapper[4845]: I0202 10:34:07.711625 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:34:07 crc kubenswrapper[4845]: E0202 10:34:07.711852 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:34:07 crc kubenswrapper[4845]: I0202 10:34:07.711874 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:34:07 crc kubenswrapper[4845]: I0202 10:34:07.712005 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:34:07 crc kubenswrapper[4845]: E0202 10:34:07.712120 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:34:07 crc kubenswrapper[4845]: E0202 10:34:07.712224 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:34:07 crc kubenswrapper[4845]: E0202 10:34:07.712787 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:34:09 crc kubenswrapper[4845]: I0202 10:34:09.712122 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:34:09 crc kubenswrapper[4845]: I0202 10:34:09.712205 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:34:09 crc kubenswrapper[4845]: I0202 10:34:09.712217 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h"
Feb 02 10:34:09 crc kubenswrapper[4845]: I0202 10:34:09.715083 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:34:09 crc kubenswrapper[4845]: E0202 10:34:09.715163 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:34:09 crc kubenswrapper[4845]: E0202 10:34:09.715332 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pmn9h" podUID="84cb7b66-62e7-4012-ab80-7c5e6ba51e35"
Feb 02 10:34:09 crc kubenswrapper[4845]: E0202 10:34:09.715017 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:34:09 crc kubenswrapper[4845]: E0202 10:34:09.715473 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.711817 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.711842 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.711954 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.711963 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.716530 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.717002 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.717027 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.717263 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.719918 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 02 10:34:11 crc kubenswrapper[4845]: I0202 10:34:11.720252 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 02 10:34:17 crc kubenswrapper[4845]: I0202 10:34:17.849534 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.858554 4845 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.909372 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.910785 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.916502 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.923551 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.923980 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.927002 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.945837 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.946412 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.946912 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.947811 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xsdsh"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.948542 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.948835 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.948616 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.949690 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.950445 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.951110 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.952319 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.955503 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4rbqr"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.956163 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4rbqr" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.956617 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.958362 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.958583 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.958600 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.958369 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.958651 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.961566 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.961671 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.961772 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.962134 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.962411 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 
10:34:19.962456 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.962746 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.963102 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.969323 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6szh7"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.969848 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.970759 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.970971 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.971118 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.971276 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.971377 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.971471 4845 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"kube-rbac-proxy" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.975185 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.975779 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.975991 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.976223 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.977302 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.977973 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978116 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978122 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978274 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978364 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 
10:34:19.978471 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978508 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978210 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978688 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978477 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.978237 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.979071 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.980076 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.980206 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.980351 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.980586 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 02 
10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.980832 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.981580 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.983352 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.984139 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.984339 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.984638 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.984780 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.984938 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.985475 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.985845 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.986059 4845 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.986208 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.986206 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9qjh"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.986406 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.986566 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.987088 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.987284 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.987470 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.987811 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.989124 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"] Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.989417 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.990036 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.990118 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.990309 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.990561 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.990706 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.990866 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.991068 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 02 10:34:19 crc kubenswrapper[4845]: I0202 10:34:19.991247 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.015994 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.016177 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.020826 4845 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.021139 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8gjpm"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.024115 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w989s"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.024214 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.025088 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.032624 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.033167 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.033551 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.033610 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.033820 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.034019 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.034193 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.034267 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.034324 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.034422 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.034511 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.035255 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.035778 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.035904 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.036350 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.039072 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.040625 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.040777 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.040806 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.041178 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.041208 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.041103 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-thf72"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 
10:34:20.041462 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.042008 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.042063 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.042493 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.042575 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pt97w"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.043061 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.043397 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044024 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044078 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044116 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044258 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044272 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044339 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044404 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044451 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044525 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044605 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.044826 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.052473 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054229 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac875c91-285a-420b-9065-50af53ab50d3-serving-cert\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054264 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-serving-cert\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054291 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/611eb8a8-5fc3-4325-96be-1dba1144259b-trusted-ca\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054315 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66ae9a2f-1c24-4a65-b961-bd9431c667f6-node-pullsecrets\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054338 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-config\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054360 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-config\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054377 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x9pr7\" (UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054404 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk9rv\" (UniqueName: \"kubernetes.io/projected/66ae9a2f-1c24-4a65-b961-bd9431c667f6-kube-api-access-fk9rv\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054425 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm6b8\" (UniqueName: \"kubernetes.io/projected/c29a9366-4664-4228-af51-b56b63c976b6-kube-api-access-pm6b8\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054451 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054473 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr6md\" (UniqueName: \"kubernetes.io/projected/6bf70521-8fdf-400f-b7cd-d96b609b4783-kube-api-access-kr6md\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054495 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-client-ca\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054521 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-encryption-config\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054542 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw74n\" (UniqueName: \"kubernetes.io/projected/ac875c91-285a-420b-9065-50af53ab50d3-kube-api-access-rw74n\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054566 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-image-import-ca\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054583 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c29a9366-4664-4228-af51-b56b63c976b6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054603 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf70521-8fdf-400f-b7cd-d96b609b4783-config\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054619 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054641 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-serving-cert\") pod \"openshift-config-operator-7777fb866f-x9pr7\" (UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054661 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6bf70521-8fdf-400f-b7cd-d96b609b4783-images\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054682 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-service-ca-bundle\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054716 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66ae9a2f-1c24-4a65-b961-bd9431c667f6-audit-dir\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054739 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbv5\" (UniqueName: \"kubernetes.io/projected/f8a6e66d-c97e-43eb-8ff0-864e543f5488-kube-api-access-xwbv5\") pod \"cluster-samples-operator-665b6dd947-8btfx\" (UID: \"f8a6e66d-c97e-43eb-8ff0-864e543f5488\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054757 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d02c35e0-ade4-4316-b4da-88c6dd349220-machine-approver-tls\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054783 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl77h\" (UniqueName: \"kubernetes.io/projected/611eb8a8-5fc3-4325-96be-1dba1144259b-kube-api-access-cl77h\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054804 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8a6e66d-c97e-43eb-8ff0-864e543f5488-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8btfx\" (UID: \"f8a6e66d-c97e-43eb-8ff0-864e543f5488\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054826 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-audit\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054850 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4zm2\" (UniqueName: \"kubernetes.io/projected/d02c35e0-ade4-4316-b4da-88c6dd349220-kube-api-access-h4zm2\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054881 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-config\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054923 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c35e0-ade4-4316-b4da-88c6dd349220-config\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054940 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-serving-cert\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054962 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9gh4\" (UniqueName: \"kubernetes.io/projected/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-kube-api-access-x9gh4\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054993 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055012 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6bf70521-8fdf-400f-b7cd-d96b609b4783-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055036 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-822n9\" (UniqueName: \"kubernetes.io/projected/a70e2a3d-9afe-4437-b9ef-fe175eee93d6-kube-api-access-822n9\") pod \"downloads-7954f5f757-4rbqr\" (UID: \"a70e2a3d-9afe-4437-b9ef-fe175eee93d6\") " pod="openshift-console/downloads-7954f5f757-4rbqr"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055077 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055082 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkq4d\" (UniqueName: \"kubernetes.io/projected/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-kube-api-access-jkq4d\") pod \"openshift-config-operator-7777fb866f-x9pr7\" (UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055537 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d02c35e0-ade4-4316-b4da-88c6dd349220-auth-proxy-config\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055603 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611eb8a8-5fc3-4325-96be-1dba1144259b-serving-cert\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.054952 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-rbhk2"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055745 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611eb8a8-5fc3-4325-96be-1dba1144259b-config\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055866 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-etcd-client\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.055962 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c29a9366-4664-4228-af51-b56b63c976b6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.056318 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.056423 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rbhk2"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.057148 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.057427 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.058539 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2vcpg"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.058706 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.058921 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.059098 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.059179 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.059296 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.059336 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.059451 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.059469 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.059179 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.060149 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.060376 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.064936 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.071385 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.072061 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.072973 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.076045 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.078542 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.086084 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.087834 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.091326 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.092786 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4rbqr"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.094339 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.095862 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rvxdk"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.096105 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.096687 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.096839 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.096948 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.099732 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hr44j"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.100554 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.100814 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.104858 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vq42n"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.105902 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fz66j"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.108161 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vq42n"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.116466 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.125660 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.126237 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.126348 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.126627 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.135557 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.135993 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.136966 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.138339 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.139089 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.139262 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wc6bh"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.139610 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wc6bh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.144118 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.155246 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pt97w"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.156773 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5d758a8-6722-4c1b-be56-fe2bb6d27830-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: \"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.156918 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d758a8-6722-4c1b-be56-fe2bb6d27830-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: \"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157005 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157102 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-image-import-ca\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157197 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf70521-8fdf-400f-b7cd-d96b609b4783-config\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157284 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157366 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157458 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157538 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee590ca4-c2f6-4dcf-973d-df26701d689f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157617 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a7dd651-1a0c-43b7-8c52-525200a7146c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157699 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157789 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-serving-cert\") pod \"openshift-config-operator-7777fb866f-x9pr7\" (UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158286 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-service-ca-bundle\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158378 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81ab0690-627e-4d43-b80c-3b3f96b06249-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157168 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158514 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158470 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e903551f-3d78-4de4-a08a-ce9ea234942c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"
Feb 02 
10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157489 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158579 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5bcf9211-edc2-4706-a9ac-b5f38b856186-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158608 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0bc78651-e3a0-4988-acfa-89a6391f4aa5-srv-cert\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158633 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158657 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7blj\" (UniqueName: \"kubernetes.io/projected/1b5822e3-ccfc-4261-9a39-8f02356add90-kube-api-access-l7blj\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" Feb 
02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158686 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee590ca4-c2f6-4dcf-973d-df26701d689f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158714 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl77h\" (UniqueName: \"kubernetes.io/projected/611eb8a8-5fc3-4325-96be-1dba1144259b-kube-api-access-cl77h\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158736 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81ab0690-627e-4d43-b80c-3b3f96b06249-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158760 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3ad81540-c66f-4f41-98a5-12aa607142fd-profile-collector-cert\") pod \"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158780 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qrd2n\" (UniqueName: \"kubernetes.io/projected/0bc78651-e3a0-4988-acfa-89a6391f4aa5-kube-api-access-qrd2n\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158412 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bf70521-8fdf-400f-b7cd-d96b609b4783-config\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158800 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-ca\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.157980 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-image-import-ca\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158826 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-config\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158919 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c35e0-ade4-4316-b4da-88c6dd349220-config\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158941 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158945 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-service-ca-bundle\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158959 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6bf70521-8fdf-400f-b7cd-d96b609b4783-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158978 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5822e3-ccfc-4261-9a39-8f02356add90-proxy-tls\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" Feb 02 
10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.158996 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-audit-policies\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159105 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159424 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-config\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159482 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159511 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d758a8-6722-4c1b-be56-fe2bb6d27830-config\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: 
\"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159533 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8phvb\" (UniqueName: \"kubernetes.io/projected/e903551f-3d78-4de4-a08a-ce9ea234942c-kube-api-access-8phvb\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159509 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02c35e0-ade4-4316-b4da-88c6dd349220-config\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159585 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e903551f-3d78-4de4-a08a-ce9ea234942c-metrics-tls\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159623 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611eb8a8-5fc3-4325-96be-1dba1144259b-config\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159641 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3a7dd651-1a0c-43b7-8c52-525200a7146c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159677 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159707 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-etcd-client\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.159738 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c29a9366-4664-4228-af51-b56b63c976b6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.160195 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czplh\" (UniqueName: \"kubernetes.io/projected/3ad81540-c66f-4f41-98a5-12aa607142fd-kube-api-access-czplh\") pod \"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.160333 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.160413 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/611eb8a8-5fc3-4325-96be-1dba1144259b-config\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.160464 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c29a9366-4664-4228-af51-b56b63c976b6-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.160345 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98d47741-7063-487f-a38b-b9c398f3e07e-audit-dir\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.160667 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3ad81540-c66f-4f41-98a5-12aa607142fd-srv-cert\") pod 
\"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.160772 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac875c91-285a-420b-9065-50af53ab50d3-serving-cert\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.160874 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-serving-cert\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161026 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-client\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161134 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8291b32a-8322-4027-af13-cd9f10390406-service-ca-bundle\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161248 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/66ae9a2f-1c24-4a65-b961-bd9431c667f6-node-pullsecrets\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161321 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/66ae9a2f-1c24-4a65-b961-bd9431c667f6-node-pullsecrets\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161429 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-config\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161455 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2vcpg"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161537 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-oauth-serving-cert\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161599 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-default-certificate\") pod \"router-default-5444994796-rbhk2\" (UID: 
\"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161621 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ngtx\" (UniqueName: \"kubernetes.io/projected/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-kube-api-access-7ngtx\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161644 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm6b8\" (UniqueName: \"kubernetes.io/projected/c29a9366-4664-4228-af51-b56b63c976b6-kube-api-access-pm6b8\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.161897 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x9pr7\" (UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162151 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-available-featuregates\") pod \"openshift-config-operator-7777fb866f-x9pr7\" (UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 
10:34:20.162238 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk9rv\" (UniqueName: \"kubernetes.io/projected/66ae9a2f-1c24-4a65-b961-bd9431c667f6-kube-api-access-fk9rv\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162294 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0bc78651-e3a0-4988-acfa-89a6391f4aa5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162317 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr6md\" (UniqueName: \"kubernetes.io/projected/6bf70521-8fdf-400f-b7cd-d96b609b4783-kube-api-access-kr6md\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162335 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-client-ca\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162374 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv2kj\" (UniqueName: \"kubernetes.io/projected/8dbf6657-96c2-472f-9e4c-0745a4c249be-kube-api-access-pv2kj\") pod \"apiserver-7bbb656c7d-49vf9\" 
(UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162395 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162413 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-encryption-config\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162465 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ab0690-627e-4d43-b80c-3b3f96b06249-config\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.162849 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-config\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163075 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-etcd-serving-ca\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163164 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-client-ca\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163354 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bstpx\" (UniqueName: \"kubernetes.io/projected/04d41e42-423a-4bac-bc05-3c424c978fd8-kube-api-access-bstpx\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163382 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-metrics-certs\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163420 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 
10:34:20.163588 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-etcd-client\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163710 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6bf70521-8fdf-400f-b7cd-d96b609b4783-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163795 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw74n\" (UniqueName: \"kubernetes.io/projected/ac875c91-285a-420b-9065-50af53ab50d3-kube-api-access-rw74n\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163825 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c29a9366-4664-4228-af51-b56b63c976b6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.164190 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9qjh"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.164210 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-serving-cert\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.163844 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8dbf6657-96c2-472f-9e4c-0745a4c249be-audit-dir\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.164433 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6bf70521-8fdf-400f-b7cd-d96b609b4783-images\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.164453 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmpd\" (UniqueName: \"kubernetes.io/projected/5bcf9211-edc2-4706-a9ac-b5f38b856186-kube-api-access-hkmpd\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.164872 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac875c91-285a-420b-9065-50af53ab50d3-serving-cert\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 
10:34:20.164468 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-serving-cert\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.165422 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.165736 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6bf70521-8fdf-400f-b7cd-d96b609b4783-images\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.166051 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c29a9366-4664-4228-af51-b56b63c976b6-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.167205 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66ae9a2f-1c24-4a65-b961-bd9431c667f6-encryption-config\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.167258 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"] Feb 02 10:34:20 crc 
kubenswrapper[4845]: I0202 10:34:20.168827 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-thf72"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.169674 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mgqnl"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.170231 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-serving-cert\") pod \"openshift-config-operator-7777fb866f-x9pr7\" (UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.170265 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mgqnl" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.171130 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f2fvl"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.171858 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.165436 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q45wk\" (UniqueName: \"kubernetes.io/projected/d1f9a812-d62e-44ca-b83f-5f240ede92a0-kube-api-access-q45wk\") pod \"dns-operator-744455d44c-2vcpg\" (UID: \"d1f9a812-d62e-44ca-b83f-5f240ede92a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.172989 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a7dd651-1a0c-43b7-8c52-525200a7146c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173026 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbv5\" (UniqueName: \"kubernetes.io/projected/f8a6e66d-c97e-43eb-8ff0-864e543f5488-kube-api-access-xwbv5\") pod \"cluster-samples-operator-665b6dd947-8btfx\" (UID: \"f8a6e66d-c97e-43eb-8ff0-864e543f5488\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173068 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66ae9a2f-1c24-4a65-b961-bd9431c667f6-audit-dir\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173143 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/d02c35e0-ade4-4316-b4da-88c6dd349220-machine-approver-tls\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173176 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-service-ca\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173202 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5bcf9211-edc2-4706-a9ac-b5f38b856186-proxy-tls\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173236 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l8w4\" (UniqueName: \"kubernetes.io/projected/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-kube-api-access-7l8w4\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173296 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b5822e3-ccfc-4261-9a39-8f02356add90-images\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173325 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6x5c\" (UniqueName: \"kubernetes.io/projected/98d47741-7063-487f-a38b-b9c398f3e07e-kube-api-access-c6x5c\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173357 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8a6e66d-c97e-43eb-8ff0-864e543f5488-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8btfx\" (UID: \"f8a6e66d-c97e-43eb-8ff0-864e543f5488\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173396 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66ae9a2f-1c24-4a65-b961-bd9431c667f6-audit-dir\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173457 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173509 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-oauth-config\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: 
I0202 10:34:20.173547 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-encryption-config\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173729 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-audit\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173824 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4zm2\" (UniqueName: \"kubernetes.io/projected/d02c35e0-ade4-4316-b4da-88c6dd349220-kube-api-access-h4zm2\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173847 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173872 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-serving-cert\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: 
\"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173909 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9gh4\" (UniqueName: \"kubernetes.io/projected/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-kube-api-access-x9gh4\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173954 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee590ca4-c2f6-4dcf-973d-df26701d689f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.173988 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-config\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174011 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-audit-policies\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174048 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174080 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-822n9\" (UniqueName: \"kubernetes.io/projected/a70e2a3d-9afe-4437-b9ef-fe175eee93d6-kube-api-access-822n9\") pod \"downloads-7954f5f757-4rbqr\" (UID: \"a70e2a3d-9afe-4437-b9ef-fe175eee93d6\") " pod="openshift-console/downloads-7954f5f757-4rbqr" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174119 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b5822e3-ccfc-4261-9a39-8f02356add90-auth-proxy-config\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174145 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-trusted-ca-bundle\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174174 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkq4d\" (UniqueName: \"kubernetes.io/projected/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-kube-api-access-jkq4d\") pod \"openshift-config-operator-7777fb866f-x9pr7\" 
(UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174199 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-service-ca\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174224 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174257 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d02c35e0-ade4-4316-b4da-88c6dd349220-auth-proxy-config\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174279 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611eb8a8-5fc3-4325-96be-1dba1144259b-serving-cert\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174301 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174329 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrcc7\" (UniqueName: \"kubernetes.io/projected/ee590ca4-c2f6-4dcf-973d-df26701d689f-kube-api-access-rrcc7\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174352 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-console-config\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174374 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e903551f-3d78-4de4-a08a-ce9ea234942c-trusted-ca\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174431 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174455 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r2jq\" (UniqueName: \"kubernetes.io/projected/8291b32a-8322-4027-af13-cd9f10390406-kube-api-access-8r2jq\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174492 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-serving-cert\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174512 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1f9a812-d62e-44ca-b83f-5f240ede92a0-metrics-tls\") pod \"dns-operator-744455d44c-2vcpg\" (UID: \"d1f9a812-d62e-44ca-b83f-5f240ede92a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174538 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/611eb8a8-5fc3-4325-96be-1dba1144259b-trusted-ca\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174563 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-config\") pod 
\"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174605 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-serving-cert\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174626 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174646 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-stats-auth\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174677 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 
10:34:20.174703 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-etcd-client\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.174930 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.175718 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-config\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.176363 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.176472 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/66ae9a2f-1c24-4a65-b961-bd9431c667f6-audit\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.176537 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-serving-cert\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.177134 
4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/611eb8a8-5fc3-4325-96be-1dba1144259b-trusted-ca\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.177414 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb"] Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.177800 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d02c35e0-ade4-4316-b4da-88c6dd349220-machine-approver-tls\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.177930 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d02c35e0-ade4-4316-b4da-88c6dd349220-auth-proxy-config\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.178406 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/611eb8a8-5fc3-4325-96be-1dba1144259b-serving-cert\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.178937 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f8a6e66d-c97e-43eb-8ff0-864e543f5488-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8btfx\" (UID: \"f8a6e66d-c97e-43eb-8ff0-864e543f5488\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.179795 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.180939 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.182107 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.183293 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8gjpm"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.191665 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.192836 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.194292 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.195826 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.199767 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.201193 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w989s"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.202291 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xsdsh"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.203358 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.204468 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f2fvl"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.205512 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rvxdk"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.207256 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vq42n"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.208347 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fz66j"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.209407 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.210425 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hr44j"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.211452 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.212445 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6szh7"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.212493 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.213425 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wsz25"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.214561 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mgqnl"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.214722 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wsz25"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.215425 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wsz25"]
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.233246 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.252751 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.272811 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275600 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-service-ca\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275640 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5bcf9211-edc2-4706-a9ac-b5f38b856186-proxy-tls\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275664 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l8w4\" (UniqueName: \"kubernetes.io/projected/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-kube-api-access-7l8w4\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275689 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b5822e3-ccfc-4261-9a39-8f02356add90-images\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275716 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-oauth-config\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275739 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6x5c\" (UniqueName: \"kubernetes.io/projected/98d47741-7063-487f-a38b-b9c398f3e07e-kube-api-access-c6x5c\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275761 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-encryption-config\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275790 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275820 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee590ca4-c2f6-4dcf-973d-df26701d689f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275851 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-config\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275874 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-audit-policies\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275956 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.275996 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1b5822e3-ccfc-4261-9a39-8f02356add90-auth-proxy-config\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276023 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-trusted-ca-bundle\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276164 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-service-ca\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276192 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276234 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276261 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrcc7\" (UniqueName: \"kubernetes.io/projected/ee590ca4-c2f6-4dcf-973d-df26701d689f-kube-api-access-rrcc7\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276282 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-console-config\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276306 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e903551f-3d78-4de4-a08a-ce9ea234942c-trusted-ca\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276331 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276356 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2jq\" (UniqueName: \"kubernetes.io/projected/8291b32a-8322-4027-af13-cd9f10390406-kube-api-access-8r2jq\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276376 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-serving-cert\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276396 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1f9a812-d62e-44ca-b83f-5f240ede92a0-metrics-tls\") pod \"dns-operator-744455d44c-2vcpg\" (UID: \"d1f9a812-d62e-44ca-b83f-5f240ede92a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276418 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-serving-cert\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276437 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276458 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-stats-auth\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276481 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276505 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-etcd-client\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276527 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5d758a8-6722-4c1b-be56-fe2bb6d27830-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: \"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276547 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d758a8-6722-4c1b-be56-fe2bb6d27830-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: \"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276567 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276591 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276614 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee590ca4-c2f6-4dcf-973d-df26701d689f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276636 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a7dd651-1a0c-43b7-8c52-525200a7146c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276661 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276686 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276711 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81ab0690-627e-4d43-b80c-3b3f96b06249-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276737 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e903551f-3d78-4de4-a08a-ce9ea234942c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276762 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5bcf9211-edc2-4706-a9ac-b5f38b856186-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276792 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0bc78651-e3a0-4988-acfa-89a6391f4aa5-srv-cert\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276812 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276836 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7blj\" (UniqueName: \"kubernetes.io/projected/1b5822e3-ccfc-4261-9a39-8f02356add90-kube-api-access-l7blj\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276859 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee590ca4-c2f6-4dcf-973d-df26701d689f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276901 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81ab0690-627e-4d43-b80c-3b3f96b06249-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276934 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3ad81540-c66f-4f41-98a5-12aa607142fd-profile-collector-cert\") pod \"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276957 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrd2n\" (UniqueName: \"kubernetes.io/projected/0bc78651-e3a0-4988-acfa-89a6391f4aa5-kube-api-access-qrd2n\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.276977 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-ca\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277001 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5822e3-ccfc-4261-9a39-8f02356add90-proxy-tls\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277021 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-audit-policies\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277041 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277060 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5d758a8-6722-4c1b-be56-fe2bb6d27830-config\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: \"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277083 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8phvb\" (UniqueName: \"kubernetes.io/projected/e903551f-3d78-4de4-a08a-ce9ea234942c-kube-api-access-8phvb\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277117 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a7dd651-1a0c-43b7-8c52-525200a7146c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277137 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e903551f-3d78-4de4-a08a-ce9ea234942c-metrics-tls\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277164 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277189 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czplh\" (UniqueName: \"kubernetes.io/projected/3ad81540-c66f-4f41-98a5-12aa607142fd-kube-api-access-czplh\") pod \"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277210 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98d47741-7063-487f-a38b-b9c398f3e07e-audit-dir\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277212 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277229 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3ad81540-c66f-4f41-98a5-12aa607142fd-srv-cert\") pod \"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277272 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-client\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277294 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8291b32a-8322-4027-af13-cd9f10390406-service-ca-bundle\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277290 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277319 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-oauth-serving-cert\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277336 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-default-certificate\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277356 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ngtx\" (UniqueName: \"kubernetes.io/projected/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-kube-api-access-7ngtx\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277391 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0bc78651-e3a0-4988-acfa-89a6391f4aa5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277665 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv2kj\" (UniqueName: \"kubernetes.io/projected/8dbf6657-96c2-472f-9e4c-0745a4c249be-kube-api-access-pv2kj\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277710 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ab0690-627e-4d43-b80c-3b3f96b06249-config\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277735 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bstpx\" (UniqueName: \"kubernetes.io/projected/04d41e42-423a-4bac-bc05-3c424c978fd8-kube-api-access-bstpx\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277773 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-metrics-certs\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277802 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277828 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8dbf6657-96c2-472f-9e4c-0745a4c249be-audit-dir\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277852 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmpd\" (UniqueName: \"kubernetes.io/projected/5bcf9211-edc2-4706-a9ac-b5f38b856186-kube-api-access-hkmpd\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277874 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-serving-cert\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277905 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q45wk\" (UniqueName: \"kubernetes.io/projected/d1f9a812-d62e-44ca-b83f-5f240ede92a0-kube-api-access-q45wk\") pod \"dns-operator-744455d44c-2vcpg\" (UID: \"d1f9a812-d62e-44ca-b83f-5f240ede92a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.277924 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a7dd651-1a0c-43b7-8c52-525200a7146c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.278257 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-audit-policies\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.278344 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-audit-policies\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.278768 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8dbf6657-96c2-472f-9e4c-0745a4c249be-audit-dir\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.279116 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a7dd651-1a0c-43b7-8c52-525200a7146c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.279470 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-service-ca\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.279716 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.279864 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-console-config\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.280389 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-trusted-ca-bundle\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.280690 4845 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-oauth-serving-cert\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.281119 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98d47741-7063-487f-a38b-b9c398f3e07e-audit-dir\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.281184 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-oauth-config\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.278255 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.281223 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-encryption-config\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.281472 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/1b5822e3-ccfc-4261-9a39-8f02356add90-auth-proxy-config\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.281594 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee590ca4-c2f6-4dcf-973d-df26701d689f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.281806 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbf6657-96c2-472f-9e4c-0745a4c249be-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.282275 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5bcf9211-edc2-4706-a9ac-b5f38b856186-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.282305 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.282401 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.282688 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.282766 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-serving-cert\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.282943 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee590ca4-c2f6-4dcf-973d-df26701d689f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.283148 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a7dd651-1a0c-43b7-8c52-525200a7146c-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.283349 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.283711 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-serving-cert\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.284203 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.284839 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8dbf6657-96c2-472f-9e4c-0745a4c249be-etcd-client\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.284933 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b5d758a8-6722-4c1b-be56-fe2bb6d27830-config\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: \"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.285233 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e903551f-3d78-4de4-a08a-ce9ea234942c-metrics-tls\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.285509 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.286215 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e903551f-3d78-4de4-a08a-ce9ea234942c-trusted-ca\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.286457 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 
10:34:20.286641 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5d758a8-6722-4c1b-be56-fe2bb6d27830-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: \"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.287370 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.292973 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.313370 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.332690 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.352749 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.363184 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-serving-cert\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 
10:34:20.372875 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.380407 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-client\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.393483 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.413733 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.417047 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-config\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.433318 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.438044 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-ca\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.453499 4845 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.457027 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-etcd-service-ca\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.493070 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.513132 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.520302 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5bcf9211-edc2-4706-a9ac-b5f38b856186-proxy-tls\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.532963 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.541673 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-default-certificate\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.553457 4845 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.561040 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-metrics-certs\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.572318 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.595613 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.599642 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8291b32a-8322-4027-af13-cd9f10390406-service-ca-bundle\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.613244 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.623163 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8291b32a-8322-4027-af13-cd9f10390406-stats-auth\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.632934 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.654546 4845 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.673669 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.694879 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.706175 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81ab0690-627e-4d43-b80c-3b3f96b06249-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.713536 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.733238 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.753608 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.761651 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.773750 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.793698 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.813813 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.822338 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.833181 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.837706 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1b5822e3-ccfc-4261-9a39-8f02356add90-images\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.853991 4845 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.874094 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.883814 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5822e3-ccfc-4261-9a39-8f02356add90-proxy-tls\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.893815 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.913450 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.934116 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.940447 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ab0690-627e-4d43-b80c-3b3f96b06249-config\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.954111 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.961417 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1f9a812-d62e-44ca-b83f-5f240ede92a0-metrics-tls\") pod \"dns-operator-744455d44c-2vcpg\" (UID: \"d1f9a812-d62e-44ca-b83f-5f240ede92a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.973807 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 02 10:34:20 crc kubenswrapper[4845]: I0202 10:34:20.993403 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.014049 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.033865 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.053442 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.071998 4845 request.go:700] Waited for 1.011531519s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.073702 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.093787 4845 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.114209 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.126623 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0bc78651-e3a0-4988-acfa-89a6391f4aa5-srv-cert\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.133600 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.142843 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3ad81540-c66f-4f41-98a5-12aa607142fd-profile-collector-cert\") pod \"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.143154 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0bc78651-e3a0-4988-acfa-89a6391f4aa5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.153675 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.177294 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.213990 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.223788 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3ad81540-c66f-4f41-98a5-12aa607142fd-srv-cert\") pod \"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.233166 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.253460 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.273036 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.293782 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.313634 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.346855 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.353534 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.373757 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.394175 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.413865 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.434126 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.453928 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.473309 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.493219 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.513668 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.533879 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.554573 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.573273 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.594293 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.613730 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.633596 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.655675 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.674071 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.694804 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.713601 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.734353 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.754570 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.775460 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.793941 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.829620 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl77h\" (UniqueName: \"kubernetes.io/projected/611eb8a8-5fc3-4325-96be-1dba1144259b-kube-api-access-cl77h\") pod \"console-operator-58897d9998-6szh7\" (UID: \"611eb8a8-5fc3-4325-96be-1dba1144259b\") " pod="openshift-console-operator/console-operator-58897d9998-6szh7"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.859327 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk9rv\" (UniqueName: \"kubernetes.io/projected/66ae9a2f-1c24-4a65-b961-bd9431c667f6-kube-api-access-fk9rv\") pod \"apiserver-76f77b778f-xsdsh\" (UID: \"66ae9a2f-1c24-4a65-b961-bd9431c667f6\") " pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.887566 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr6md\" (UniqueName: \"kubernetes.io/projected/6bf70521-8fdf-400f-b7cd-d96b609b4783-kube-api-access-kr6md\") pod \"machine-api-operator-5694c8668f-xk8gn\" (UID: \"6bf70521-8fdf-400f-b7cd-d96b609b4783\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.899438 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm6b8\" (UniqueName: \"kubernetes.io/projected/c29a9366-4664-4228-af51-b56b63c976b6-kube-api-access-pm6b8\") pod \"openshift-apiserver-operator-796bbdcf4f-rch6p\" (UID: \"c29a9366-4664-4228-af51-b56b63c976b6\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.914668 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw74n\" (UniqueName: \"kubernetes.io/projected/ac875c91-285a-420b-9065-50af53ab50d3-kube-api-access-rw74n\") pod \"route-controller-manager-6576b87f9c-jvc49\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.915932 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.928424 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6szh7"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.932918 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.953680 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.973001 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 02 10:34:21 crc kubenswrapper[4845]: I0202 10:34:21.993045 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.012793 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.035612 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.062237 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.069951 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbv5\" (UniqueName: \"kubernetes.io/projected/f8a6e66d-c97e-43eb-8ff0-864e543f5488-kube-api-access-xwbv5\") pod \"cluster-samples-operator-665b6dd947-8btfx\" (UID: \"f8a6e66d-c97e-43eb-8ff0-864e543f5488\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.079783 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.091613 4845 request.go:700] Waited for 1.916291334s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.094732 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkq4d\" (UniqueName: \"kubernetes.io/projected/6de6b4aa-d335-4eb0-b880-7a21c9336ebf-kube-api-access-jkq4d\") pod \"openshift-config-operator-7777fb866f-x9pr7\" (UID: \"6de6b4aa-d335-4eb0-b880-7a21c9336ebf\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.116631 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9gh4\" (UniqueName: \"kubernetes.io/projected/1eb84a90-bb90-4b5f-9e30-7415cb27cd39-kube-api-access-x9gh4\") pod \"authentication-operator-69f744f599-mvp5t\" (UID: \"1eb84a90-bb90-4b5f-9e30-7415cb27cd39\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.126679 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.127025 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4zm2\" (UniqueName: \"kubernetes.io/projected/d02c35e0-ade4-4316-b4da-88c6dd349220-kube-api-access-h4zm2\") pod \"machine-approver-56656f9798-bl6zg\" (UID: \"d02c35e0-ade4-4316-b4da-88c6dd349220\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.134826 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6szh7"]
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.153185 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.153326 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.159177 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-822n9\" (UniqueName: \"kubernetes.io/projected/a70e2a3d-9afe-4437-b9ef-fe175eee93d6-kube-api-access-822n9\") pod \"downloads-7954f5f757-4rbqr\" (UID: \"a70e2a3d-9afe-4437-b9ef-fe175eee93d6\") " pod="openshift-console/downloads-7954f5f757-4rbqr"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.162498 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.173104 4845 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.174369 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4rbqr"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.187119 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.193209 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.197183 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.210540 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.229191 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l8w4\" (UniqueName: \"kubernetes.io/projected/2b15a9aa-d443-4058-8e02-0f0eafbd7dd9-kube-api-access-7l8w4\") pod \"etcd-operator-b45778765-pt97w\" (UID: \"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9\") " pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.248727 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6x5c\" (UniqueName: \"kubernetes.io/projected/98d47741-7063-487f-a38b-b9c398f3e07e-kube-api-access-c6x5c\") pod \"oauth-openshift-558db77b4-w989s\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.270596 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xsdsh"]
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.276386 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"]
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.278500 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81ab0690-627e-4d43-b80c-3b3f96b06249-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-25kbf\" (UID: \"81ab0690-627e-4d43-b80c-3b3f96b06249\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.294143 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrd2n\" (UniqueName: \"kubernetes.io/projected/0bc78651-e3a0-4988-acfa-89a6391f4aa5-kube-api-access-qrd2n\") pod \"olm-operator-6b444d44fb-t62rp\" (UID: \"0bc78651-e3a0-4988-acfa-89a6391f4aa5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.303140 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w989s"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.306128 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2jq\" (UniqueName: \"kubernetes.io/projected/8291b32a-8322-4027-af13-cd9f10390406-kube-api-access-8r2jq\") pod \"router-default-5444994796-rbhk2\" (UID: \"8291b32a-8322-4027-af13-cd9f10390406\") " pod="openshift-ingress/router-default-5444994796-rbhk2"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.330117 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.330918 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ngtx\" (UniqueName: \"kubernetes.io/projected/aa7bf903-1f8f-4d7c-b5a1-33a07160500f-kube-api-access-7ngtx\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfh2j\" (UID: \"aa7bf903-1f8f-4d7c-b5a1-33a07160500f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.346123 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rbhk2"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.361802 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xk8gn"]
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.361810 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.368796 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3a7dd651-1a0c-43b7-8c52-525200a7146c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-f4gzw\" (UID: \"3a7dd651-1a0c-43b7-8c52-525200a7146c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.373254 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv2kj\" (UniqueName: \"kubernetes.io/projected/8dbf6657-96c2-472f-9e4c-0745a4c249be-kube-api-access-pv2kj\") pod \"apiserver-7bbb656c7d-49vf9\" (UID: \"8dbf6657-96c2-472f-9e4c-0745a4c249be\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.389305 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bstpx\" (UniqueName: \"kubernetes.io/projected/04d41e42-423a-4bac-bc05-3c424c978fd8-kube-api-access-bstpx\") pod \"console-f9d7485db-8gjpm\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.406708 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee590ca4-c2f6-4dcf-973d-df26701d689f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.410838 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.426602 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmpd\" (UniqueName: \"kubernetes.io/projected/5bcf9211-edc2-4706-a9ac-b5f38b856186-kube-api-access-hkmpd\") pod \"machine-config-controller-84d6567774-qx7sb\" (UID: \"5bcf9211-edc2-4706-a9ac-b5f38b856186\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.429503 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:22 crc kubenswrapper[4845]: W0202 10:34:22.451235 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8291b32a_8322_4027_af13_cd9f10390406.slice/crio-2739cd6071c9c851dfa234cb94024efbbd69ed061825d9ea4a08b2cf5cd70d07 WatchSource:0}: Error finding container 2739cd6071c9c851dfa234cb94024efbbd69ed061825d9ea4a08b2cf5cd70d07: Status 404 returned error can't find the container with id 2739cd6071c9c851dfa234cb94024efbbd69ed061825d9ea4a08b2cf5cd70d07
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.452486 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrcc7\" (UniqueName: \"kubernetes.io/projected/ee590ca4-c2f6-4dcf-973d-df26701d689f-kube-api-access-rrcc7\") pod \"cluster-image-registry-operator-dc59b4c8b-m55h8\" (UID: \"ee590ca4-c2f6-4dcf-973d-df26701d689f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.468935 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czplh\" (UniqueName: \"kubernetes.io/projected/3ad81540-c66f-4f41-98a5-12aa607142fd-kube-api-access-czplh\") pod \"catalog-operator-68c6474976-zznfs\" (UID: \"3ad81540-c66f-4f41-98a5-12aa607142fd\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.487307 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8phvb\" (UniqueName: \"kubernetes.io/projected/e903551f-3d78-4de4-a08a-ce9ea234942c-kube-api-access-8phvb\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.496981 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.498826 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6szh7" event={"ID":"611eb8a8-5fc3-4325-96be-1dba1144259b","Type":"ContainerStarted","Data":"f5fc2b79ee1a35c707a25e545423c9c4a81a43e2d7e74219382e255e6642c953"}
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.498854 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6szh7" event={"ID":"611eb8a8-5fc3-4325-96be-1dba1144259b","Type":"ContainerStarted","Data":"3b01c61a80a2a1ddd99e79d54045acec2e6a2a588ce464b85b6d9784ba96ab8a"}
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.499534 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6szh7"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.500159 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" event={"ID":"6bf70521-8fdf-400f-b7cd-d96b609b4783","Type":"ContainerStarted","Data":"1376156431a753612df059d26a63e0f28fa28d2f0b741c60818c98e052d0836b"}
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.500715 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rbhk2" event={"ID":"8291b32a-8322-4027-af13-cd9f10390406","Type":"ContainerStarted","Data":"2739cd6071c9c851dfa234cb94024efbbd69ed061825d9ea4a08b2cf5cd70d07"}
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.505326 4845 patch_prober.go:28] interesting pod/console-operator-58897d9998-6szh7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.505371 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6szh7" podUID="611eb8a8-5fc3-4325-96be-1dba1144259b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/readyz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.507510 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e903551f-3d78-4de4-a08a-ce9ea234942c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-h92cr\" (UID: \"e903551f-3d78-4de4-a08a-ce9ea234942c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.513531 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" event={"ID":"ac875c91-285a-420b-9065-50af53ab50d3","Type":"ContainerStarted","Data":"dca4acc312ecd37056dbc4edd5440def5a4b22eb4ea478d220c28a8e7aa4f810"}
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.528764 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" event={"ID":"66ae9a2f-1c24-4a65-b961-bd9431c667f6","Type":"ContainerStarted","Data":"439d03444803cd71a625bdc8afb36c9934af3b183281b8357bbd2b8cdba5e60d"}
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.530016 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" event={"ID":"d02c35e0-ade4-4316-b4da-88c6dd349220","Type":"ContainerStarted","Data":"6da09c3a2ec613197b31bce44f495dfc90eebc7f5768603b22daa847db1f963b"}
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.534759 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q45wk\" (UniqueName: \"kubernetes.io/projected/d1f9a812-d62e-44ca-b83f-5f240ede92a0-kube-api-access-q45wk\") pod \"dns-operator-744455d44c-2vcpg\" (UID: \"d1f9a812-d62e-44ca-b83f-5f240ede92a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.549116 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4rbqr"]
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.551045 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.552894 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7blj\" (UniqueName: \"kubernetes.io/projected/1b5822e3-ccfc-4261-9a39-8f02356add90-kube-api-access-l7blj\") pod \"machine-config-operator-74547568cd-m9h2g\" (UID: \"1b5822e3-ccfc-4261-9a39-8f02356add90\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.564488 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.583917 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5d758a8-6722-4c1b-be56-fe2bb6d27830-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qmvcn\" (UID: \"b5d758a8-6722-4c1b-be56-fe2bb6d27830\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.589260 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8gjpm"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.598051 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.601531 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7"]
Feb 02 10:34:22 crc kubenswrapper[4845]: W0202 10:34:22.604247 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda70e2a3d_9afe_4437_b9ef_fe175eee93d6.slice/crio-7479615bc5acb0d95dc873a7b6422c041138cc9e9f9895eb879626d491a81011 WatchSource:0}: Error finding container 7479615bc5acb0d95dc873a7b6422c041138cc9e9f9895eb879626d491a81011: Status 404 returned error can't find the container with id 7479615bc5acb0d95dc873a7b6422c041138cc9e9f9895eb879626d491a81011
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.610231 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.614987 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: \"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615048 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgm6\" (UniqueName: \"kubernetes.io/projected/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-kube-api-access-lpgm6\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: \"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615095 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xzkk\" (UniqueName: \"kubernetes.io/projected/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-kube-api-access-4xzkk\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615111 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/339fe372-b3de-4832-b32f-0218d2c0545b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615190 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-config\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615259 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: \"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615298 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-client-ca\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615331 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-trusted-ca\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615348 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g77qx\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-kube-api-access-g77qx\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615384 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-bound-sa-token\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615403 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-serving-cert\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615420 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/339fe372-b3de-4832-b32f-0218d2c0545b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615456 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-registry-certificates\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615484 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-registry-tls\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615508 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.615550 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh"
Feb 02 10:34:22 crc kubenswrapper[4845]: E0202 10:34:22.615933 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.115922586 +0000 UTC m=+144.207324036 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.616316 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" Feb 02 10:34:22 crc kubenswrapper[4845]: W0202 10:34:22.632070 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6de6b4aa_d335_4eb0_b880_7a21c9336ebf.slice/crio-11ea871b8e34a0eef6934dd0d4b800fe4ce3f391c95fdd680371c20dfc11e944 WatchSource:0}: Error finding container 11ea871b8e34a0eef6934dd0d4b800fe4ce3f391c95fdd680371c20dfc11e944: Status 404 returned error can't find the container with id 11ea871b8e34a0eef6934dd0d4b800fe4ce3f391c95fdd680371c20dfc11e944 Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.637996 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.644245 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mvp5t"] Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.654739 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w989s"] Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.655424 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.670942 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.681651 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p"] Feb 02 10:34:22 crc kubenswrapper[4845]: E0202 10:34:22.717503 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.217470319 +0000 UTC m=+144.308871769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.717545 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718038 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82fb9557-bbfb-42e4-ba6c-522685082e66-cert\") 
pod \"ingress-canary-mgqnl\" (UID: \"82fb9557-bbfb-42e4-ba6c-522685082e66\") " pod="openshift-ingress-canary/ingress-canary-mgqnl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718298 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hl8d\" (UniqueName: \"kubernetes.io/projected/82fb9557-bbfb-42e4-ba6c-522685082e66-kube-api-access-2hl8d\") pod \"ingress-canary-mgqnl\" (UID: \"82fb9557-bbfb-42e4-ba6c-522685082e66\") " pod="openshift-ingress-canary/ingress-canary-mgqnl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718382 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xghn\" (UniqueName: \"kubernetes.io/projected/54f66031-6300-4334-8a24-bfe02897b467-kube-api-access-7xghn\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718485 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xzkk\" (UniqueName: \"kubernetes.io/projected/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-kube-api-access-4xzkk\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718518 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccb6c76e-4ee2-4dcc-91a9-c91e25299780-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s7qmj\" (UID: \"ccb6c76e-4ee2-4dcc-91a9-c91e25299780\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718549 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j97kh\" (UniqueName: \"kubernetes.io/projected/722bda9f-5a8b-4c83-8b1f-790da0003ce9-kube-api-access-j97kh\") pod \"control-plane-machine-set-operator-78cbb6b69f-c46tw\" (UID: \"722bda9f-5a8b-4c83-8b1f-790da0003ce9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718618 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/339fe372-b3de-4832-b32f-0218d2c0545b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718650 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-csi-data-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718685 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-config\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718738 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: 
\"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718775 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a47109aa-f36b-4a01-89d4-832ff0a7a700-serving-cert\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718870 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-client-ca\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.718921 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-mountpoint-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719020 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-trusted-ca\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719095 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-bound-sa-token\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719120 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g77qx\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-kube-api-access-g77qx\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719192 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7a64280-2fd1-4149-826a-1f0daed66dc1-config-volume\") pod \"dns-default-f2fvl\" (UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719252 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-serving-cert\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719282 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-plugins-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719368 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-config-volume\") pod \"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719413 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/339fe372-b3de-4832-b32f-0218d2c0545b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719463 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f295e287-05b6-45e1-bfd5-3c71d7a87f15-tmpfs\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719494 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/028dfe05-0d8f-4d6f-b5f4-af641b911b52-signing-cabundle\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719544 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-certs\") pod \"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " 
pod="openshift-machine-config-operator/machine-config-server-wc6bh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719626 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlq47\" (UniqueName: \"kubernetes.io/projected/be245fb2-4ef3-4642-aae0-14954ab28ffa-kube-api-access-wlq47\") pod \"multus-admission-controller-857f4d67dd-rvxdk\" (UID: \"be245fb2-4ef3-4642-aae0-14954ab28ffa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719685 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-registry-certificates\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719713 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a47109aa-f36b-4a01-89d4-832ff0a7a700-config\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719743 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be245fb2-4ef3-4642-aae0-14954ab28ffa-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rvxdk\" (UID: \"be245fb2-4ef3-4642-aae0-14954ab28ffa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.719772 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9xnxk\" (UniqueName: \"kubernetes.io/projected/f295e287-05b6-45e1-bfd5-3c71d7a87f15-kube-api-access-9xnxk\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.721429 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/339fe372-b3de-4832-b32f-0218d2c0545b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.722200 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-config\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.722602 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-trusted-ca\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.724414 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-registry-certificates\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.725927 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-client-ca\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.725972 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x654k\" (UniqueName: \"kubernetes.io/projected/a47109aa-f36b-4a01-89d4-832ff0a7a700-kube-api-access-x654k\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.726074 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-registry-tls\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.726102 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7a64280-2fd1-4149-826a-1f0daed66dc1-metrics-tls\") pod \"dns-default-f2fvl\" (UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.726216 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-node-bootstrap-token\") pod \"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " pod="openshift-machine-config-operator/machine-config-server-wc6bh" Feb 02 
10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.726292 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: \"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.726497 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/722bda9f-5a8b-4c83-8b1f-790da0003ce9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c46tw\" (UID: \"722bda9f-5a8b-4c83-8b1f-790da0003ce9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.726661 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.726685 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:22 crc kubenswrapper[4845]: E0202 10:34:22.727489 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.227475479 +0000 UTC m=+144.318876929 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.727547 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cqbz\" (UniqueName: \"kubernetes.io/projected/ccb6c76e-4ee2-4dcc-91a9-c91e25299780-kube-api-access-6cqbz\") pod \"package-server-manager-789f6589d5-s7qmj\" (UID: \"ccb6c76e-4ee2-4dcc-91a9-c91e25299780\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.727577 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f295e287-05b6-45e1-bfd5-3c71d7a87f15-apiservice-cert\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.727780 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-serving-cert\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.727937 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.728179 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/339fe372-b3de-4832-b32f-0218d2c0545b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.728303 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x7f5\" (UniqueName: \"kubernetes.io/projected/b7a64280-2fd1-4149-826a-1f0daed66dc1-kube-api-access-9x7f5\") pod \"dns-default-f2fvl\" (UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.728536 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: \"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.728608 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-socket-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.728661 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk7lz\" (UniqueName: \"kubernetes.io/projected/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-kube-api-access-mk7lz\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.728751 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfmb7\" (UniqueName: \"kubernetes.io/projected/6bedd3e9-4212-4b9b-866a-a473d7f1c632-kube-api-access-tfmb7\") pod \"migrator-59844c95c7-r92c5\" (UID: \"6bedd3e9-4212-4b9b-866a-a473d7f1c632\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.729305 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f295e287-05b6-45e1-bfd5-3c71d7a87f15-webhook-cert\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.730170 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-registration-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 
10:34:22.730235 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-secret-volume\") pod \"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.730302 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6jd9\" (UniqueName: \"kubernetes.io/projected/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-kube-api-access-j6jd9\") pod \"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.730360 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/028dfe05-0d8f-4d6f-b5f4-af641b911b52-signing-key\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.730389 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.730595 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mhrk\" (UniqueName: \"kubernetes.io/projected/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-kube-api-access-9mhrk\") pod 
\"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " pod="openshift-machine-config-operator/machine-config-server-wc6bh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.730654 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chjf2\" (UniqueName: \"kubernetes.io/projected/028dfe05-0d8f-4d6f-b5f4-af641b911b52-kube-api-access-chjf2\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.730676 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgm6\" (UniqueName: \"kubernetes.io/projected/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-kube-api-access-lpgm6\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: \"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.732652 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.734493 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx"] Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.742582 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-registry-tls\") pod \"image-registry-697d97f7c8-thf72\" (UID: 
\"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.753929 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: \"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.764451 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-bound-sa-token\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.764731 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-pt97w"] Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.772541 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g77qx\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-kube-api-access-g77qx\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.787345 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"] Feb 02 10:34:22 crc kubenswrapper[4845]: W0202 10:34:22.792917 4845 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc29a9366_4664_4228_af51_b56b63c976b6.slice/crio-f729fd0ab07975c64a8bf46c2b75ee8ebbbb431631f9eabfbd59b576e27c02a4 WatchSource:0}: Error finding container f729fd0ab07975c64a8bf46c2b75ee8ebbbb431631f9eabfbd59b576e27c02a4: Status 404 returned error can't find the container with id f729fd0ab07975c64a8bf46c2b75ee8ebbbb431631f9eabfbd59b576e27c02a4 Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.805789 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xzkk\" (UniqueName: \"kubernetes.io/projected/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-kube-api-access-4xzkk\") pod \"controller-manager-879f6c89f-z9qjh\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.817361 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgm6\" (UniqueName: \"kubernetes.io/projected/daa4c1cf-5cd2-4dba-8ddb-543a716a4628-kube-api-access-lpgm6\") pod \"kube-storage-version-migrator-operator-b67b599dd-ns95h\" (UID: \"daa4c1cf-5cd2-4dba-8ddb-543a716a4628\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831400 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831600 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7a64280-2fd1-4149-826a-1f0daed66dc1-config-volume\") pod \"dns-default-f2fvl\" 
(UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-plugins-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: E0202 10:34:22.831655 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.331632138 +0000 UTC m=+144.423033588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831803 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-config-volume\") pod \"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831820 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-plugins-dir\") pod 
\"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831843 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f295e287-05b6-45e1-bfd5-3c71d7a87f15-tmpfs\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831863 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/028dfe05-0d8f-4d6f-b5f4-af641b911b52-signing-cabundle\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831902 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-certs\") pod \"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " pod="openshift-machine-config-operator/machine-config-server-wc6bh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831920 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be245fb2-4ef3-4642-aae0-14954ab28ffa-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rvxdk\" (UID: \"be245fb2-4ef3-4642-aae0-14954ab28ffa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831937 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlq47\" (UniqueName: 
\"kubernetes.io/projected/be245fb2-4ef3-4642-aae0-14954ab28ffa-kube-api-access-wlq47\") pod \"multus-admission-controller-857f4d67dd-rvxdk\" (UID: \"be245fb2-4ef3-4642-aae0-14954ab28ffa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831956 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a47109aa-f36b-4a01-89d4-832ff0a7a700-config\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831973 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xnxk\" (UniqueName: \"kubernetes.io/projected/f295e287-05b6-45e1-bfd5-3c71d7a87f15-kube-api-access-9xnxk\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.831996 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x654k\" (UniqueName: \"kubernetes.io/projected/a47109aa-f36b-4a01-89d4-832ff0a7a700-kube-api-access-x654k\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832020 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7a64280-2fd1-4149-826a-1f0daed66dc1-metrics-tls\") pod \"dns-default-f2fvl\" (UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832037 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-node-bootstrap-token\") pod \"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " pod="openshift-machine-config-operator/machine-config-server-wc6bh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832053 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/722bda9f-5a8b-4c83-8b1f-790da0003ce9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c46tw\" (UID: \"722bda9f-5a8b-4c83-8b1f-790da0003ce9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832076 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832093 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cqbz\" (UniqueName: \"kubernetes.io/projected/ccb6c76e-4ee2-4dcc-91a9-c91e25299780-kube-api-access-6cqbz\") pod \"package-server-manager-789f6589d5-s7qmj\" (UID: \"ccb6c76e-4ee2-4dcc-91a9-c91e25299780\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832110 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832127 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f295e287-05b6-45e1-bfd5-3c71d7a87f15-apiservice-cert\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832162 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk7lz\" (UniqueName: \"kubernetes.io/projected/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-kube-api-access-mk7lz\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832307 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x7f5\" (UniqueName: \"kubernetes.io/projected/b7a64280-2fd1-4149-826a-1f0daed66dc1-kube-api-access-9x7f5\") pod \"dns-default-f2fvl\" (UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832326 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-socket-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832351 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tfmb7\" (UniqueName: \"kubernetes.io/projected/6bedd3e9-4212-4b9b-866a-a473d7f1c632-kube-api-access-tfmb7\") pod \"migrator-59844c95c7-r92c5\" (UID: \"6bedd3e9-4212-4b9b-866a-a473d7f1c632\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832387 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f295e287-05b6-45e1-bfd5-3c71d7a87f15-webhook-cert\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832405 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-registration-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832422 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-secret-volume\") pod \"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832440 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6jd9\" (UniqueName: \"kubernetes.io/projected/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-kube-api-access-j6jd9\") pod \"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:22 crc 
kubenswrapper[4845]: I0202 10:34:22.832455 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/028dfe05-0d8f-4d6f-b5f4-af641b911b52-signing-key\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832472 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832491 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mhrk\" (UniqueName: \"kubernetes.io/projected/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-kube-api-access-9mhrk\") pod \"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " pod="openshift-machine-config-operator/machine-config-server-wc6bh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832510 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chjf2\" (UniqueName: \"kubernetes.io/projected/028dfe05-0d8f-4d6f-b5f4-af641b911b52-kube-api-access-chjf2\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832526 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82fb9557-bbfb-42e4-ba6c-522685082e66-cert\") pod \"ingress-canary-mgqnl\" (UID: \"82fb9557-bbfb-42e4-ba6c-522685082e66\") " pod="openshift-ingress-canary/ingress-canary-mgqnl" 
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832620 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-config-volume\") pod \"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832977 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hl8d\" (UniqueName: \"kubernetes.io/projected/82fb9557-bbfb-42e4-ba6c-522685082e66-kube-api-access-2hl8d\") pod \"ingress-canary-mgqnl\" (UID: \"82fb9557-bbfb-42e4-ba6c-522685082e66\") " pod="openshift-ingress-canary/ingress-canary-mgqnl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.833014 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xghn\" (UniqueName: \"kubernetes.io/projected/54f66031-6300-4334-8a24-bfe02897b467-kube-api-access-7xghn\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.833108 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccb6c76e-4ee2-4dcc-91a9-c91e25299780-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s7qmj\" (UID: \"ccb6c76e-4ee2-4dcc-91a9-c91e25299780\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.833124 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-csi-data-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: 
\"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.833140 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j97kh\" (UniqueName: \"kubernetes.io/projected/722bda9f-5a8b-4c83-8b1f-790da0003ce9-kube-api-access-j97kh\") pod \"control-plane-machine-set-operator-78cbb6b69f-c46tw\" (UID: \"722bda9f-5a8b-4c83-8b1f-790da0003ce9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.833160 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a47109aa-f36b-4a01-89d4-832ff0a7a700-serving-cert\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.833176 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-mountpoint-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.833181 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f295e287-05b6-45e1-bfd5-3c71d7a87f15-tmpfs\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.833960 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/028dfe05-0d8f-4d6f-b5f4-af641b911b52-signing-cabundle\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.834352 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-socket-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: E0202 10:34:22.834500 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.334483861 +0000 UTC m=+144.425885421 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.836269 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-registration-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.835878 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-csi-data-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.832419 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7a64280-2fd1-4149-826a-1f0daed66dc1-config-volume\") pod \"dns-default-f2fvl\" (UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.839037 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/54f66031-6300-4334-8a24-bfe02897b467-mountpoint-dir\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.842581 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.843455 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b7a64280-2fd1-4149-826a-1f0daed66dc1-metrics-tls\") pod \"dns-default-f2fvl\" (UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.844310 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-secret-volume\") pod 
\"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.844933 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.845408 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a47109aa-f36b-4a01-89d4-832ff0a7a700-config\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.845696 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/be245fb2-4ef3-4642-aae0-14954ab28ffa-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rvxdk\" (UID: \"be245fb2-4ef3-4642-aae0-14954ab28ffa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.846804 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82fb9557-bbfb-42e4-ba6c-522685082e66-cert\") pod \"ingress-canary-mgqnl\" (UID: \"82fb9557-bbfb-42e4-ba6c-522685082e66\") " pod="openshift-ingress-canary/ingress-canary-mgqnl" Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.848312 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:22 crc 
kubenswrapper[4845]: I0202 10:34:22.850514 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/722bda9f-5a8b-4c83-8b1f-790da0003ce9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-c46tw\" (UID: \"722bda9f-5a8b-4c83-8b1f-790da0003ce9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.851383 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f295e287-05b6-45e1-bfd5-3c71d7a87f15-apiservice-cert\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.852123 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccb6c76e-4ee2-4dcc-91a9-c91e25299780-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s7qmj\" (UID: \"ccb6c76e-4ee2-4dcc-91a9-c91e25299780\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.852466 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-certs\") pod \"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " pod="openshift-machine-config-operator/machine-config-server-wc6bh"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.856654 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/028dfe05-0d8f-4d6f-b5f4-af641b911b52-signing-key\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.857155 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-node-bootstrap-token\") pod \"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " pod="openshift-machine-config-operator/machine-config-server-wc6bh"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.858597 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j"]
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.861305 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f295e287-05b6-45e1-bfd5-3c71d7a87f15-webhook-cert\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.863061 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a47109aa-f36b-4a01-89d4-832ff0a7a700-serving-cert\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.867105 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlq47\" (UniqueName: \"kubernetes.io/projected/be245fb2-4ef3-4642-aae0-14954ab28ffa-kube-api-access-wlq47\") pod \"multus-admission-controller-857f4d67dd-rvxdk\" (UID: \"be245fb2-4ef3-4642-aae0-14954ab28ffa\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.894071 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6jd9\" (UniqueName: \"kubernetes.io/projected/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-kube-api-access-j6jd9\") pod \"collect-profiles-29500470-ncqjg\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.916353 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfmb7\" (UniqueName: \"kubernetes.io/projected/6bedd3e9-4212-4b9b-866a-a473d7f1c632-kube-api-access-tfmb7\") pod \"migrator-59844c95c7-r92c5\" (UID: \"6bedd3e9-4212-4b9b-866a-a473d7f1c632\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.935157 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:34:22 crc kubenswrapper[4845]: E0202 10:34:22.935623 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.435607872 +0000 UTC m=+144.527009322 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.941035 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cqbz\" (UniqueName: \"kubernetes.io/projected/ccb6c76e-4ee2-4dcc-91a9-c91e25299780-kube-api-access-6cqbz\") pod \"package-server-manager-789f6589d5-s7qmj\" (UID: \"ccb6c76e-4ee2-4dcc-91a9-c91e25299780\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.943630 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs"]
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.954725 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf"]
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.959545 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hl8d\" (UniqueName: \"kubernetes.io/projected/82fb9557-bbfb-42e4-ba6c-522685082e66-kube-api-access-2hl8d\") pod \"ingress-canary-mgqnl\" (UID: \"82fb9557-bbfb-42e4-ba6c-522685082e66\") " pod="openshift-ingress-canary/ingress-canary-mgqnl"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.990664 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk7lz\" (UniqueName: \"kubernetes.io/projected/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-kube-api-access-mk7lz\") pod \"marketplace-operator-79b997595-hr44j\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " pod="openshift-marketplace/marketplace-operator-79b997595-hr44j"
Feb 02 10:34:22 crc kubenswrapper[4845]: I0202 10:34:22.996349 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x7f5\" (UniqueName: \"kubernetes.io/projected/b7a64280-2fd1-4149-826a-1f0daed66dc1-kube-api-access-9x7f5\") pod \"dns-default-f2fvl\" (UID: \"b7a64280-2fd1-4149-826a-1f0daed66dc1\") " pod="openshift-dns/dns-default-f2fvl"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.012269 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mhrk\" (UniqueName: \"kubernetes.io/projected/0b2a5cbc-1208-4d37-be25-4d333adfb8f6-kube-api-access-9mhrk\") pod \"machine-config-server-wc6bh\" (UID: \"0b2a5cbc-1208-4d37-be25-4d333adfb8f6\") " pod="openshift-machine-config-operator/machine-config-server-wc6bh"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.020337 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.030895 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xghn\" (UniqueName: \"kubernetes.io/projected/54f66031-6300-4334-8a24-bfe02897b467-kube-api-access-7xghn\") pod \"csi-hostpathplugin-wsz25\" (UID: \"54f66031-6300-4334-8a24-bfe02897b467\") " pod="hostpath-provisioner/csi-hostpathplugin-wsz25"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.036390 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.036711 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.536697312 +0000 UTC m=+144.628098762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.037837 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.046584 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.052792 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chjf2\" (UniqueName: \"kubernetes.io/projected/028dfe05-0d8f-4d6f-b5f4-af641b911b52-kube-api-access-chjf2\") pod \"service-ca-9c57cc56f-vq42n\" (UID: \"028dfe05-0d8f-4d6f-b5f4-af641b911b52\") " pod="openshift-service-ca/service-ca-9c57cc56f-vq42n"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.060197 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.069423 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x654k\" (UniqueName: \"kubernetes.io/projected/a47109aa-f36b-4a01-89d4-832ff0a7a700-kube-api-access-x654k\") pod \"service-ca-operator-777779d784-fz66j\" (UID: \"a47109aa-f36b-4a01-89d4-832ff0a7a700\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.089830 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j97kh\" (UniqueName: \"kubernetes.io/projected/722bda9f-5a8b-4c83-8b1f-790da0003ce9-kube-api-access-j97kh\") pod \"control-plane-machine-set-operator-78cbb6b69f-c46tw\" (UID: \"722bda9f-5a8b-4c83-8b1f-790da0003ce9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.095038 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw"]
Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.101450 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81ab0690_627e_4d43_b80c_3b3f96b06249.slice/crio-2b8a69512cfb9d9dd4b38eb8d78e0cce29c4dc9efcda9772c59b456a376af80d WatchSource:0}: Error finding container 2b8a69512cfb9d9dd4b38eb8d78e0cce29c4dc9efcda9772c59b456a376af80d: Status 404 returned error can't find the container with id 2b8a69512cfb9d9dd4b38eb8d78e0cce29c4dc9efcda9772c59b456a376af80d
Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.103021 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad81540_c66f_4f41_98a5_12aa607142fd.slice/crio-c8c1223c8850e3e25947a1ef508bacce0d25138ed4ae9fcde550f956bd922c8b WatchSource:0}: Error finding container c8c1223c8850e3e25947a1ef508bacce0d25138ed4ae9fcde550f956bd922c8b: Status 404 returned error can't find the container with id c8c1223c8850e3e25947a1ef508bacce0d25138ed4ae9fcde550f956bd922c8b
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.112162 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vq42n"
Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.118818 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a7dd651_1a0c_43b7_8c52_525200a7146c.slice/crio-6451af35a2b974a31dbc52d12a449d3f3d7c0b3e202be8504cf3c103e1bf1242 WatchSource:0}: Error finding container 6451af35a2b974a31dbc52d12a449d3f3d7c0b3e202be8504cf3c103e1bf1242: Status 404 returned error can't find the container with id 6451af35a2b974a31dbc52d12a449d3f3d7c0b3e202be8504cf3c103e1bf1242
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.125318 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.135537 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xnxk\" (UniqueName: \"kubernetes.io/projected/f295e287-05b6-45e1-bfd5-3c71d7a87f15-kube-api-access-9xnxk\") pod \"packageserver-d55dfcdfc-c5c85\" (UID: \"f295e287-05b6-45e1-bfd5-3c71d7a87f15\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.137195 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.137327 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.137651 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.637631228 +0000 UTC m=+144.729032738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.150112 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.161021 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.167374 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wc6bh"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.180153 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mgqnl"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.180349 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f2fvl"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.198438 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wsz25"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.204595 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8gjpm"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.230555 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.238574 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.238857 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.738846762 +0000 UTC m=+144.830248212 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.245852 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.264445 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-2vcpg"]
Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.287330 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d41e42_423a_4bac_bc05_3c424c978fd8.slice/crio-983de89da05d00215d904c21a6798495b0919b3a61d3d8890b2f47a3f16bcb7f WatchSource:0}: Error finding container 983de89da05d00215d904c21a6798495b0919b3a61d3d8890b2f47a3f16bcb7f: Status 404 returned error can't find the container with id 983de89da05d00215d904c21a6798495b0919b3a61d3d8890b2f47a3f16bcb7f
Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.319939 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1f9a812_d62e_44ca_b83f_5f240ede92a0.slice/crio-8bcad2e78a62e68573e117efc6dc671b25a9e715a587b9e0fe3d8baefac4f023 WatchSource:0}: Error finding container 8bcad2e78a62e68573e117efc6dc671b25a9e715a587b9e0fe3d8baefac4f023: Status 404 returned error can't find the container with id 8bcad2e78a62e68573e117efc6dc671b25a9e715a587b9e0fe3d8baefac4f023
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.339364 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.340043 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.840025704 +0000 UTC m=+144.931427154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.363760 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.383098 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.400054 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.442496 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.442546 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.443494 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.444115 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:23.944098211 +0000 UTC m=+145.035499671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.539159 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" event={"ID":"1eb84a90-bb90-4b5f-9e30-7415cb27cd39","Type":"ContainerStarted","Data":"557f339ef62a68f82cafbdec4d59ac3717807470c7e72a477ae014a95e0d4fa8"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.539211 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" event={"ID":"1eb84a90-bb90-4b5f-9e30-7415cb27cd39","Type":"ContainerStarted","Data":"95ffa56c127374c1d2cf12b892f47ee7784e1f0ddfa880bf6d39b092c0c0ab47"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.545711 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.545870 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.0458467 +0000 UTC m=+145.137248150 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.548158 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.548573 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.048559729 +0000 UTC m=+145.139961179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.548687 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" event={"ID":"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9","Type":"ContainerStarted","Data":"5b6d1b889f30e64bb630226b12a8dec11c71a3e29ecb559bc71709271b41b56a"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.552245 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" event={"ID":"8dbf6657-96c2-472f-9e4c-0745a4c249be","Type":"ContainerStarted","Data":"bc7cad6e4a8caca58a375e3d6dbb0c656d2b38d5e89f85b77c6e6bd886778b12"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.558612 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" event={"ID":"d02c35e0-ade4-4316-b4da-88c6dd349220","Type":"ContainerStarted","Data":"2887a44009767064989b7afdb40a163bd61e4a12e17734f65b637e9f5ac66eb6"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.564782 4845 generic.go:334] "Generic (PLEG): container finished" podID="6de6b4aa-d335-4eb0-b880-7a21c9336ebf" containerID="91bd591227b78f9c1f0d5b5f4151094dc6149a69b81bc559948c2239bb684c96" exitCode=0
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.564834 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" event={"ID":"6de6b4aa-d335-4eb0-b880-7a21c9336ebf","Type":"ContainerDied","Data":"91bd591227b78f9c1f0d5b5f4151094dc6149a69b81bc559948c2239bb684c96"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.564856 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" event={"ID":"6de6b4aa-d335-4eb0-b880-7a21c9336ebf","Type":"ContainerStarted","Data":"11ea871b8e34a0eef6934dd0d4b800fe4ce3f391c95fdd680371c20dfc11e944"}
Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.566258 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5d758a8_6722_4c1b_be56_fe2bb6d27830.slice/crio-87b2943d2cc3d4173e627e67c4f05ab2812f4c96bb4e7fd389c8fdd788324d2b WatchSource:0}: Error finding container 87b2943d2cc3d4173e627e67c4f05ab2812f4c96bb4e7fd389c8fdd788324d2b: Status 404 returned error can't find the container with id 87b2943d2cc3d4173e627e67c4f05ab2812f4c96bb4e7fd389c8fdd788324d2b
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.567737 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx" event={"ID":"f8a6e66d-c97e-43eb-8ff0-864e543f5488","Type":"ContainerStarted","Data":"47f118d00a342510dc1eb1714c5308c9aac6e9fef7bd7daefc6a0ad7f74e5995"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.578912 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.586934 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9qjh"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.596415 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj"]
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.601325 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp" event={"ID":"0bc78651-e3a0-4988-acfa-89a6391f4aa5","Type":"ContainerStarted","Data":"8321bcfb26cb6f4f0dd226d6bce488ebee57d77db38a8e8691052e7c8969b3dc"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.602985 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.605693 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j" event={"ID":"aa7bf903-1f8f-4d7c-b5a1-33a07160500f","Type":"ContainerStarted","Data":"e847ef4d6fd85f53465f7476e19b5522eebcea516d5ef0150abc2947c4be1ccc"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.627612 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" event={"ID":"1b5822e3-ccfc-4261-9a39-8f02356add90","Type":"ContainerStarted","Data":"e7e9916fbdb937e78d4a7ec3652add0eba916379930c8fe6e2b49871f597ea2f"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.638585 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg" event={"ID":"d1f9a812-d62e-44ca-b83f-5f240ede92a0","Type":"ContainerStarted","Data":"8bcad2e78a62e68573e117efc6dc671b25a9e715a587b9e0fe3d8baefac4f023"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.639706 4845 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-t62rp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.639747 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp" podUID="0bc78651-e3a0-4988-acfa-89a6391f4aa5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.644721 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" event={"ID":"ac875c91-285a-420b-9065-50af53ab50d3","Type":"ContainerStarted","Data":"1fcb883e8f41725d44677baa16381c121b0232bd5110fcb8fbbbde75a1dff506"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.645083 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.648661 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.648879 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.148859856 +0000 UTC m=+145.240261306 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.648979 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72"
Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.651305 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.151289816 +0000 UTC m=+145.242691266 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.655068 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw" event={"ID":"3a7dd651-1a0c-43b7-8c52-525200a7146c","Type":"ContainerStarted","Data":"6451af35a2b974a31dbc52d12a449d3f3d7c0b3e202be8504cf3c103e1bf1242"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.657362 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4rbqr" event={"ID":"a70e2a3d-9afe-4437-b9ef-fe175eee93d6","Type":"ContainerStarted","Data":"b9ba1eba59ccca8d0b46f34f7d248688065dda041c0ab8d79afac8d4f083e5d7"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.657382 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4rbqr" event={"ID":"a70e2a3d-9afe-4437-b9ef-fe175eee93d6","Type":"ContainerStarted","Data":"7479615bc5acb0d95dc873a7b6422c041138cc9e9f9895eb879626d491a81011"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.658058 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4rbqr"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.668390 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6szh7" podStartSLOduration=123.668370681 podStartE2EDuration="2m3.668370681s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:23.663978914 +0000 UTC m=+144.755380364" watchObservedRunningTime="2026-02-02 10:34:23.668370681 +0000 UTC m=+144.759772131"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.663863 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-4rbqr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body=
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.669062 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4rbqr" podUID="a70e2a3d-9afe-4437-b9ef-fe175eee93d6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused"
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.683850 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs" event={"ID":"3ad81540-c66f-4f41-98a5-12aa607142fd","Type":"ContainerStarted","Data":"c8c1223c8850e3e25947a1ef508bacce0d25138ed4ae9fcde550f956bd922c8b"}
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.689546 4845 generic.go:334] "Generic (PLEG): container finished" podID="66ae9a2f-1c24-4a65-b961-bd9431c667f6" containerID="a24494154dda6f22ac9de519e84094a42d7649e179a60d5f4c7c098a52187a04" exitCode=0
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.689594 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" event={"ID":"66ae9a2f-1c24-4a65-b961-bd9431c667f6","Type":"ContainerDied","Data":"a24494154dda6f22ac9de519e84094a42d7649e179a60d5f4c7c098a52187a04"}
Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.692988 4845 manager.go:1169] Failed to process watch event
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fd9461a_3591_4e69_a9fd_2fd7de4d84cd.slice/crio-3b0e906a8843ecd4cf6e3849e6d3f269dfbf6989409b6ad8fb10a3cbc5ff1f7a WatchSource:0}: Error finding container 3b0e906a8843ecd4cf6e3849e6d3f269dfbf6989409b6ad8fb10a3cbc5ff1f7a: Status 404 returned error can't find the container with id 3b0e906a8843ecd4cf6e3849e6d3f269dfbf6989409b6ad8fb10a3cbc5ff1f7a Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.694067 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb6c76e_4ee2_4dcc_91a9_c91e25299780.slice/crio-814d327cd7c7f0f1b425f19b9e9e4776ff7d9e8491a231c94d8778274b06a4ff WatchSource:0}: Error finding container 814d327cd7c7f0f1b425f19b9e9e4776ff7d9e8491a231c94d8778274b06a4ff: Status 404 returned error can't find the container with id 814d327cd7c7f0f1b425f19b9e9e4776ff7d9e8491a231c94d8778274b06a4ff Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.695624 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8gjpm" event={"ID":"04d41e42-423a-4bac-bc05-3c424c978fd8","Type":"ContainerStarted","Data":"983de89da05d00215d904c21a6798495b0919b3a61d3d8890b2f47a3f16bcb7f"} Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.706266 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf" event={"ID":"81ab0690-627e-4d43-b80c-3b3f96b06249","Type":"ContainerStarted","Data":"2b8a69512cfb9d9dd4b38eb8d78e0cce29c4dc9efcda9772c59b456a376af80d"} Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.711774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" event={"ID":"c29a9366-4664-4228-af51-b56b63c976b6","Type":"ContainerStarted","Data":"cc18f294854dbb97b7db62b4d8ad87760ce4257c1fd5ea150cda370bd9cdde14"} 
Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.711807 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" event={"ID":"c29a9366-4664-4228-af51-b56b63c976b6","Type":"ContainerStarted","Data":"f729fd0ab07975c64a8bf46c2b75ee8ebbbb431631f9eabfbd59b576e27c02a4"} Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.739069 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" event={"ID":"6bf70521-8fdf-400f-b7cd-d96b609b4783","Type":"ContainerStarted","Data":"8715d78959c283955471ec0450d5a36f2662bb391ddd5cb8291fa26867112699"} Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.739099 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" event={"ID":"6bf70521-8fdf-400f-b7cd-d96b609b4783","Type":"ContainerStarted","Data":"7688cc46374e131bdea5caaf1914a891f91babd65e68afda0e1d0ed55aa31e13"} Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.739111 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vq42n"] Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.744952 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rbhk2" event={"ID":"8291b32a-8322-4027-af13-cd9f10390406","Type":"ContainerStarted","Data":"1075bda810646faa2c991a9e04f27770b88a84303d9b9e36fd2dc8f540cb4d92"} Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.751026 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.752843 4845 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.252825169 +0000 UTC m=+145.344226619 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.770533 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" event={"ID":"98d47741-7063-487f-a38b-b9c398f3e07e","Type":"ContainerStarted","Data":"7e70d90a9fcf67e25e1301d6daeb2cdf5956e3a601c854abda0627c2816b60da"} Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.810245 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hr44j"] Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.854750 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.855929 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-02 10:34:24.355907477 +0000 UTC m=+145.447308927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.857649 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rvxdk"] Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.887508 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6szh7" Feb 02 10:34:23 crc kubenswrapper[4845]: W0202 10:34:23.889175 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe245fb2_4ef3_4642_aae0_14954ab28ffa.slice/crio-314e29ad8f84e0e7dd86789011160c80b856dada5fd6f295211f6b975ef5647d WatchSource:0}: Error finding container 314e29ad8f84e0e7dd86789011160c80b856dada5fd6f295211f6b975ef5647d: Status 404 returned error can't find the container with id 314e29ad8f84e0e7dd86789011160c80b856dada5fd6f295211f6b975ef5647d Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.959937 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:23 crc kubenswrapper[4845]: E0202 10:34:23.960605 4845 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.460588852 +0000 UTC m=+145.551990302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:23 crc kubenswrapper[4845]: I0202 10:34:23.977735 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5"] Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.067745 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.068099 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.568088307 +0000 UTC m=+145.659489757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.106822 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f2fvl"] Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.107283 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg"] Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.143950 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fz66j"] Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.146358 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wsz25"] Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.168844 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.169214 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.669197908 +0000 UTC m=+145.760599358 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.207655 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw"] Feb 02 10:34:24 crc kubenswrapper[4845]: W0202 10:34:24.235427 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a64280_2fd1_4149_826a_1f0daed66dc1.slice/crio-a30b8f2c0a8db239b6d988e96744ca4d9e974c39638b428b14f8c460b33bdb8e WatchSource:0}: Error finding container a30b8f2c0a8db239b6d988e96744ca4d9e974c39638b428b14f8c460b33bdb8e: Status 404 returned error can't find the container with id a30b8f2c0a8db239b6d988e96744ca4d9e974c39638b428b14f8c460b33bdb8e Feb 02 10:34:24 crc kubenswrapper[4845]: W0202 10:34:24.253818 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54f66031_6300_4334_8a24_bfe02897b467.slice/crio-8a704df30240b126ce68777f74cfd7d8b364c39667844dfaba4bffb00e186fae WatchSource:0}: Error finding container 8a704df30240b126ce68777f74cfd7d8b364c39667844dfaba4bffb00e186fae: Status 404 returned error can't find the container with id 8a704df30240b126ce68777f74cfd7d8b364c39667844dfaba4bffb00e186fae Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.270355 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.270687 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.77067643 +0000 UTC m=+145.862077880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.347685 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.351975 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:24 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:24 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:24 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.352020 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.374431 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.374551 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.874510909 +0000 UTC m=+145.965912369 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.374692 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.375056 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.875045835 +0000 UTC m=+145.966447285 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.403579 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.485203 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.485817 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:24.985799215 +0000 UTC m=+146.077200665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.524125 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mgqnl"] Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.533767 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85"] Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.586593 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.586992 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:25.086977598 +0000 UTC m=+146.178379058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.597369 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" podStartSLOduration=123.597346938 podStartE2EDuration="2m3.597346938s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:24.596056231 +0000 UTC m=+145.687457711" watchObservedRunningTime="2026-02-02 10:34:24.597346938 +0000 UTC m=+145.688748398" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.645899 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4rbqr" podStartSLOduration=124.645860464 podStartE2EDuration="2m4.645860464s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:24.64363649 +0000 UTC m=+145.735037940" watchObservedRunningTime="2026-02-02 10:34:24.645860464 +0000 UTC m=+145.737261914" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.668115 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-xk8gn" podStartSLOduration=123.668099579 podStartE2EDuration="2m3.668099579s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:24.666526123 +0000 UTC m=+145.757927603" watchObservedRunningTime="2026-02-02 10:34:24.668099579 +0000 UTC m=+145.759501029" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.695253 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.695637 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:25.195623407 +0000 UTC m=+146.287024857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.751537 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-rbhk2" podStartSLOduration=123.751519387 podStartE2EDuration="2m3.751519387s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:24.748987743 +0000 UTC m=+145.840389193" watchObservedRunningTime="2026-02-02 10:34:24.751519387 +0000 UTC m=+145.842920837" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.795383 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8gjpm" event={"ID":"04d41e42-423a-4bac-bc05-3c424c978fd8","Type":"ContainerStarted","Data":"966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.796479 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.796924 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:25.296911813 +0000 UTC m=+146.388313263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.808839 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" event={"ID":"daa4c1cf-5cd2-4dba-8ddb-543a716a4628","Type":"ContainerStarted","Data":"3de7c534b24a3dec680dcab862d65cd483c1db000fa370db2efab2e3e6767386"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.808907 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" event={"ID":"daa4c1cf-5cd2-4dba-8ddb-543a716a4628","Type":"ContainerStarted","Data":"829cda8afcfa6e37a62d80832e5473a03e1782a9812f40517469d21c1208c661"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.826074 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx" event={"ID":"f8a6e66d-c97e-43eb-8ff0-864e543f5488","Type":"ContainerStarted","Data":"fd8b74886f7593d6c5904904a6a2172128ea7c3d810ed91683593d31c4cdd201"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.849134 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp" podStartSLOduration=123.849119986 podStartE2EDuration="2m3.849119986s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:24.787264183 +0000 UTC m=+145.878665643" watchObservedRunningTime="2026-02-02 10:34:24.849119986 +0000 UTC m=+145.940521436" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.851485 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsz25" event={"ID":"54f66031-6300-4334-8a24-bfe02897b467","Type":"ContainerStarted","Data":"8a704df30240b126ce68777f74cfd7d8b364c39667844dfaba4bffb00e186fae"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.897858 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" event={"ID":"be245fb2-4ef3-4642-aae0-14954ab28ffa","Type":"ContainerStarted","Data":"314e29ad8f84e0e7dd86789011160c80b856dada5fd6f295211f6b975ef5647d"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.899135 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:24 crc kubenswrapper[4845]: E0202 10:34:24.899468 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:25.399436514 +0000 UTC m=+146.490837964 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.921161 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" event={"ID":"2b15a9aa-d443-4058-8e02-0f0eafbd7dd9","Type":"ContainerStarted","Data":"0930e530cc937d4a8c4fcbcb2b9c11527383b8973b22bb7ef168b7c76196d1f6"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.923774 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rch6p" podStartSLOduration=124.923754469 podStartE2EDuration="2m4.923754469s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:24.920455814 +0000 UTC m=+146.011857264" watchObservedRunningTime="2026-02-02 10:34:24.923754469 +0000 UTC m=+146.015155919" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.930763 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" event={"ID":"6de6b4aa-d335-4eb0-b880-7a21c9336ebf","Type":"ContainerStarted","Data":"ac061238d93eaceea2c3c153ac595f65ac8da4d25d1d4e6fba424a5c25dee01b"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.931455 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 
10:34:24.935586 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp" event={"ID":"0bc78651-e3a0-4988-acfa-89a6391f4aa5","Type":"ContainerStarted","Data":"0833a582c1546aeb6322237fde9c5e15066b8c7aa72d979f3b37512bdf6cf54b"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.947510 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5" event={"ID":"6bedd3e9-4212-4b9b-866a-a473d7f1c632","Type":"ContainerStarted","Data":"818e18a70080750b6efc8d22b53b542d100d5c445a3ff5d753ecdc32a2f77394"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.959950 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-t62rp" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.961521 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8gjpm" podStartSLOduration=124.961510464 podStartE2EDuration="2m4.961510464s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:24.955025796 +0000 UTC m=+146.046427246" watchObservedRunningTime="2026-02-02 10:34:24.961510464 +0000 UTC m=+146.052911914" Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.968165 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf" event={"ID":"81ab0690-627e-4d43-b80c-3b3f96b06249","Type":"ContainerStarted","Data":"15544f89f02f2d0c7bc9f81b6b0bf6862f621c35cad77beee5e5aca14d31c82c"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.987762 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j" event={"ID":"aa7bf903-1f8f-4d7c-b5a1-33a07160500f","Type":"ContainerStarted","Data":"000968c3aaeba68df1e5aa3e39dce40cc677ba4ca087fc6e592bb5257d50c063"} Feb 02 10:34:24 crc kubenswrapper[4845]: I0202 10:34:24.990641 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mgqnl" event={"ID":"82fb9557-bbfb-42e4-ba6c-522685082e66","Type":"ContainerStarted","Data":"04ea9b2c57c8554b0c87b059c86647c80058d6f975f895eeaf19b901ed4d8a07"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.004418 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" event={"ID":"e903551f-3d78-4de4-a08a-ce9ea234942c","Type":"ContainerStarted","Data":"969afdc837fa269d9afb2c4d0b10e8fedcbb33e325a20f2dca14132d0a8bcb66"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.004470 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" event={"ID":"e903551f-3d78-4de4-a08a-ce9ea234942c","Type":"ContainerStarted","Data":"36711d5d688314e1c7a50c51aa64c94a7e064c0ee700e52fa409c4f4c1c90839"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.005184 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.005572 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-02 10:34:25.50555732 +0000 UTC m=+146.596958770 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.035802 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-pt97w" podStartSLOduration=125.035781506 podStartE2EDuration="2m5.035781506s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:24.985489529 +0000 UTC m=+146.076890979" watchObservedRunningTime="2026-02-02 10:34:25.035781506 +0000 UTC m=+146.127182956" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.066478 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" event={"ID":"5bcf9211-edc2-4706-a9ac-b5f38b856186","Type":"ContainerStarted","Data":"c4dd4a396199592b0a6e6a2eac4ea47d74e443293291f751e0d5d35a972b367d"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.066543 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" event={"ID":"5bcf9211-edc2-4706-a9ac-b5f38b856186","Type":"ContainerStarted","Data":"894d57605e7e68cc219bd424e8cd3a2a5de95835d8d798f779b61c0f5123fa5e"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.069134 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" podStartSLOduration=125.069093922 podStartE2EDuration="2m5.069093922s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.037697752 +0000 UTC m=+146.129099202" watchObservedRunningTime="2026-02-02 10:34:25.069093922 +0000 UTC m=+146.160495372" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.076595 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfh2j" podStartSLOduration=125.076580149 podStartE2EDuration="2m5.076580149s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.064474228 +0000 UTC m=+146.155875688" watchObservedRunningTime="2026-02-02 10:34:25.076580149 +0000 UTC m=+146.167981599" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.106786 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.108355 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-25kbf" podStartSLOduration=124.108334879 podStartE2EDuration="2m4.108334879s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 
10:34:25.105025463 +0000 UTC m=+146.196426913" watchObservedRunningTime="2026-02-02 10:34:25.108334879 +0000 UTC m=+146.199736329" Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.109064 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:25.60904212 +0000 UTC m=+146.700443580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.137380 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wc6bh" event={"ID":"0b2a5cbc-1208-4d37-be25-4d333adfb8f6","Type":"ContainerStarted","Data":"af2aeeb6d16df5b83775ee46aa1e1142963c553ad5b8907efe73aa04c58538c7"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.137439 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wc6bh" event={"ID":"0b2a5cbc-1208-4d37-be25-4d333adfb8f6","Type":"ContainerStarted","Data":"6d307057403aa8f48e333b18180cfb605fcdbfa9e2931d1077fae8d5e6a4012a"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.146508 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" event={"ID":"f295e287-05b6-45e1-bfd5-3c71d7a87f15","Type":"ContainerStarted","Data":"102c0ff0c549b327001c7a5305c3fbae61954daed1b57772e0162dd791a156d2"} Feb 
02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.158605 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw" event={"ID":"3a7dd651-1a0c-43b7-8c52-525200a7146c","Type":"ContainerStarted","Data":"44acb71bf2d8d0256e9c17b725034395d88f81900558083e9d9ea694b3f8010d"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.212374 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.213591 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" event={"ID":"ccb6c76e-4ee2-4dcc-91a9-c91e25299780","Type":"ContainerStarted","Data":"0c742ea686f9e530518fd01a249f2ffc756ee0ec7968b6d68e641ebe68bbdb82"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.213626 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" event={"ID":"ccb6c76e-4ee2-4dcc-91a9-c91e25299780","Type":"ContainerStarted","Data":"814d327cd7c7f0f1b425f19b9e9e4776ff7d9e8491a231c94d8778274b06a4ff"} Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.213676 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:25.713660222 +0000 UTC m=+146.805061752 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.215010 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wc6bh" podStartSLOduration=6.214993171 podStartE2EDuration="6.214993171s" podCreationTimestamp="2026-02-02 10:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.20668874 +0000 UTC m=+146.298090190" watchObservedRunningTime="2026-02-02 10:34:25.214993171 +0000 UTC m=+146.306394621" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.229058 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-f4gzw" podStartSLOduration=124.229035538 podStartE2EDuration="2m4.229035538s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.227784141 +0000 UTC m=+146.319185591" watchObservedRunningTime="2026-02-02 10:34:25.229035538 +0000 UTC m=+146.320436988" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.238223 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" event={"ID":"028dfe05-0d8f-4d6f-b5f4-af641b911b52","Type":"ContainerStarted","Data":"28809ff5e84978fe7b40548668775a4193032fc3227abda1f378e09172539205"} Feb 02 
10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.251084 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" event={"ID":"ee590ca4-c2f6-4dcf-973d-df26701d689f","Type":"ContainerStarted","Data":"ad4ea7f0dae6a93207b9d95af04c1479e13b2c9c74a9493efcae48a316941e9f"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.251122 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" event={"ID":"ee590ca4-c2f6-4dcf-973d-df26701d689f","Type":"ContainerStarted","Data":"334f628fe2fe677e548098919d50f4b0a6a2219adbb3c5e601c061ff9223d142"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.253496 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" event={"ID":"d2ddc114-bfc4-444f-aeb3-8d43d95bec09","Type":"ContainerStarted","Data":"a4725b8556aef79f1893fb492aa385b87083299f9e257b7881345d1bcb5c2f73"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.254341 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.259726 4845 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hr44j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.259782 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" podUID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Feb 02 10:34:25 crc 
kubenswrapper[4845]: I0202 10:34:25.263799 4845 generic.go:334] "Generic (PLEG): container finished" podID="8dbf6657-96c2-472f-9e4c-0745a4c249be" containerID="6fe42116e9a251c4ad584597f07499c48bc249afad969ff74a8a21e23add0298" exitCode=0 Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.263870 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" event={"ID":"8dbf6657-96c2-472f-9e4c-0745a4c249be","Type":"ContainerDied","Data":"6fe42116e9a251c4ad584597f07499c48bc249afad969ff74a8a21e23add0298"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.286845 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" podStartSLOduration=124.286826623 podStartE2EDuration="2m4.286826623s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.285412202 +0000 UTC m=+146.376813652" watchObservedRunningTime="2026-02-02 10:34:25.286826623 +0000 UTC m=+146.378228073" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.304321 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs" event={"ID":"3ad81540-c66f-4f41-98a5-12aa607142fd","Type":"ContainerStarted","Data":"a810ce009a55cd2c040663024b3101546b89da6e3681e8d1428597eedb73e851"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.306526 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.315301 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.316500 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:25.816485233 +0000 UTC m=+146.907886683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.318179 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" event={"ID":"98d47741-7063-487f-a38b-b9c398f3e07e","Type":"ContainerStarted","Data":"93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.320029 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.322306 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" podStartSLOduration=124.322282431 podStartE2EDuration="2m4.322282431s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.318781339 +0000 UTC m=+146.410182789" 
watchObservedRunningTime="2026-02-02 10:34:25.322282431 +0000 UTC m=+146.413683881" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.332748 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.337376 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg" event={"ID":"d1f9a812-d62e-44ca-b83f-5f240ede92a0","Type":"ContainerStarted","Data":"ac7cd0233c2a1454cbee42a05b5ad14623f0234f364d71ae7f42e532bf7fb259"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.362780 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-m55h8" podStartSLOduration=125.362764254 podStartE2EDuration="2m5.362764254s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.362609809 +0000 UTC m=+146.454011259" watchObservedRunningTime="2026-02-02 10:34:25.362764254 +0000 UTC m=+146.454165704" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.365954 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:25 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:25 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:25 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.366005 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.382027 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" event={"ID":"d02c35e0-ade4-4316-b4da-88c6dd349220","Type":"ContainerStarted","Data":"288f218b1dcf5bbec815dfab3c8f5edf4f1b5807cf1cd6745b0be226db11a9e8"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.403004 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw" event={"ID":"722bda9f-5a8b-4c83-8b1f-790da0003ce9","Type":"ContainerStarted","Data":"42f5ad96602eb5e50b5958e8e21f18aba8a956169763527c2ae2ef527cf9b0da"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.419263 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.420759 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:25.920744325 +0000 UTC m=+147.012145775 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.462094 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f2fvl" event={"ID":"b7a64280-2fd1-4149-826a-1f0daed66dc1","Type":"ContainerStarted","Data":"a30b8f2c0a8db239b6d988e96744ca4d9e974c39638b428b14f8c460b33bdb8e"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.496016 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" event={"ID":"1b5822e3-ccfc-4261-9a39-8f02356add90","Type":"ContainerStarted","Data":"adf72df2145a59d69648b8ba66489c83ea3180983f62d0d8950b8d38cb88f311"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.503554 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" podStartSLOduration=125.503539204 podStartE2EDuration="2m5.503539204s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.470789565 +0000 UTC m=+146.562191005" watchObservedRunningTime="2026-02-02 10:34:25.503539204 +0000 UTC m=+146.594940654" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.515140 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" 
event={"ID":"a47109aa-f36b-4a01-89d4-832ff0a7a700","Type":"ContainerStarted","Data":"a73eac9f6daf9c64976bdda1cc87e3394ca15671f86625d4403db62a979d75d3"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.525545 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.525848 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.025836341 +0000 UTC m=+147.117237791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.541247 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn" event={"ID":"b5d758a8-6722-4c1b-be56-fe2bb6d27830","Type":"ContainerStarted","Data":"87b2943d2cc3d4173e627e67c4f05ab2812f4c96bb4e7fd389c8fdd788324d2b"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.543108 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zznfs" podStartSLOduration=124.543091601 
podStartE2EDuration="2m4.543091601s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.510024782 +0000 UTC m=+146.601426232" watchObservedRunningTime="2026-02-02 10:34:25.543091601 +0000 UTC m=+146.634493051" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.566972 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" event={"ID":"30bde55e-4121-4b71-b6f4-6cb3a9acd82e","Type":"ContainerStarted","Data":"784d840cf0db3fd4333a0324bf44adc6d663afbb97888004d15f7427f54cbe89"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.576220 4845 csr.go:261] certificate signing request csr-jvq7s is approved, waiting to be issued Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.588625 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" event={"ID":"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd","Type":"ContainerStarted","Data":"3b0e906a8843ecd4cf6e3849e6d3f269dfbf6989409b6ad8fb10a3cbc5ff1f7a"} Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.588756 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-4rbqr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.588801 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4rbqr" podUID="a70e2a3d-9afe-4437-b9ef-fe175eee93d6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.589309 4845 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.591021 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bl6zg" podStartSLOduration=126.59100907 podStartE2EDuration="2m6.59100907s" podCreationTimestamp="2026-02-02 10:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.544158342 +0000 UTC m=+146.635559812" watchObservedRunningTime="2026-02-02 10:34:25.59100907 +0000 UTC m=+146.682410520" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.607079 4845 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z9qjh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.607128 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" podUID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.607815 4845 csr.go:257] certificate signing request csr-jvq7s is issued Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.629047 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.633738 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.133714828 +0000 UTC m=+147.225116278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.640123 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" podStartSLOduration=124.640107623 podStartE2EDuration="2m4.640107623s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.589093014 +0000 UTC m=+146.680494464" watchObservedRunningTime="2026-02-02 10:34:25.640107623 +0000 UTC m=+146.731509073" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.643338 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn" podStartSLOduration=124.643308746 podStartE2EDuration="2m4.643308746s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.63864654 +0000 UTC 
m=+146.730047990" watchObservedRunningTime="2026-02-02 10:34:25.643308746 +0000 UTC m=+146.734710196" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.681059 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" podStartSLOduration=124.681042469 podStartE2EDuration="2m4.681042469s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.675834378 +0000 UTC m=+146.767235828" watchObservedRunningTime="2026-02-02 10:34:25.681042469 +0000 UTC m=+146.772443919" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.702862 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mvp5t" podStartSLOduration=125.702847371 podStartE2EDuration="2m5.702847371s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.702205003 +0000 UTC m=+146.793606453" watchObservedRunningTime="2026-02-02 10:34:25.702847371 +0000 UTC m=+146.794248821" Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.767446 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.767876 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 10:34:26.267854876 +0000 UTC m=+147.359256326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.868873 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.869283 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.369268494 +0000 UTC m=+147.460669944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:25 crc kubenswrapper[4845]: I0202 10:34:25.971475 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:25 crc kubenswrapper[4845]: E0202 10:34:25.972092 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.472078174 +0000 UTC m=+147.563479614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.074004 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.074438 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.574418231 +0000 UTC m=+147.665819781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.175686 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.175898 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.675851931 +0000 UTC m=+147.767253391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.176040 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.176451 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.676434697 +0000 UTC m=+147.767836137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.276663 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.276871 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.776846048 +0000 UTC m=+147.868247498 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.277115 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.277462 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.777453436 +0000 UTC m=+147.868854886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.320259 4845 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-w989s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.33:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.320321 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" podUID="98d47741-7063-487f-a38b-b9c398f3e07e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.33:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.352957 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:26 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:26 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:26 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.353059 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" 
podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.378457 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.378671 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.878643639 +0000 UTC m=+147.970045099 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.378803 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.379206 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.879194294 +0000 UTC m=+147.970595744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.480371 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.480579 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.980553532 +0000 UTC m=+148.071954982 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.481037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.481448 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:26.981437788 +0000 UTC m=+148.072839318 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.582057 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.582295 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.08225261 +0000 UTC m=+148.173654060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.582362 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.582674 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.082661572 +0000 UTC m=+148.174063022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.594213 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vq42n" event={"ID":"028dfe05-0d8f-4d6f-b5f4-af641b911b52","Type":"ContainerStarted","Data":"5b64f630425ff95eb7934e0cf5c431541b5bd437362a1587c9171fd6b89d35c4"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.595715 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx" event={"ID":"f8a6e66d-c97e-43eb-8ff0-864e543f5488","Type":"ContainerStarted","Data":"b5a5b0340d9d607f7850ae86c5554d19fb4668620ab068f6ee45e21c5946b57b"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.597658 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" event={"ID":"8dbf6657-96c2-472f-9e4c-0745a4c249be","Type":"ContainerStarted","Data":"c7ff3b764b789d897b9bfd196a743b3b228345d2a5a279295f9f622a525bd1d7"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.599523 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" event={"ID":"66ae9a2f-1c24-4a65-b961-bd9431c667f6","Type":"ContainerStarted","Data":"d93abf78c7f8a52572409ae932a6094ae129c4a95b7589d23cbb0ecf2406399c"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.599565 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" 
event={"ID":"66ae9a2f-1c24-4a65-b961-bd9431c667f6","Type":"ContainerStarted","Data":"52ab081bae665b609acc8b914a4691d13bdfc6cd70a3336411e22ca4912a0bda"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.601047 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-m9h2g" event={"ID":"1b5822e3-ccfc-4261-9a39-8f02356add90","Type":"ContainerStarted","Data":"d324e596860192125ed4a47197cbde148a2f7bf8eda70f9f83c6374b1bb45a93"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.602755 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fz66j" event={"ID":"a47109aa-f36b-4a01-89d4-832ff0a7a700","Type":"ContainerStarted","Data":"5298701e10681b445590c289fa3baa5f6316b6015c9fcb07a88c831aa3431d9f"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.604524 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" event={"ID":"d2ddc114-bfc4-444f-aeb3-8d43d95bec09","Type":"ContainerStarted","Data":"6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.605238 4845 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hr44j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.605291 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" podUID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.605996 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" event={"ID":"30bde55e-4121-4b71-b6f4-6cb3a9acd82e","Type":"ContainerStarted","Data":"523a5bcb42e2050e61412c7d355eedcd8752caa8ebcb7f962918e3ce965038aa"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.607361 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" event={"ID":"f295e287-05b6-45e1-bfd5-3c71d7a87f15","Type":"ContainerStarted","Data":"2032d8132587b0205ba2a042609961f10d7df1a4db30f776d5ae33d8f3213dbb"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.607518 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.608531 4845 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-c5c85 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: connection refused" start-of-body= Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.608604 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-02 10:29:25 +0000 UTC, rotation deadline is 2026-11-01 18:32:46.197899056 +0000 UTC Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.608650 4845 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6535h58m19.589251181s for next certificate rotation Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.608643 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" podUID="f295e287-05b6-45e1-bfd5-3c71d7a87f15" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": dial tcp 10.217.0.22:5443: connect: 
connection refused" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.609632 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f2fvl" event={"ID":"b7a64280-2fd1-4149-826a-1f0daed66dc1","Type":"ContainerStarted","Data":"ca70ec0f75ce8b4bd33b625b6da7e53c1506ec0e88616865dd0f11fbb10f6045"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.609668 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.609681 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f2fvl" event={"ID":"b7a64280-2fd1-4149-826a-1f0daed66dc1","Type":"ContainerStarted","Data":"251d4707445d8f9e2de4c7325fbce1e984410b11da18dc7756bd01d8d7fbd776"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.610878 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mgqnl" event={"ID":"82fb9557-bbfb-42e4-ba6c-522685082e66","Type":"ContainerStarted","Data":"d7d86aeb9105f2263c176d1c04cc1ecd64b08201304551ffd25cb8509d88e94d"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.612576 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg" event={"ID":"d1f9a812-d62e-44ca-b83f-5f240ede92a0","Type":"ContainerStarted","Data":"84f1faaa8fffaf568a69864a03da17d4204af6eca21c94c1ab201ae2b147577c"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.613978 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsz25" event={"ID":"54f66031-6300-4334-8a24-bfe02897b467","Type":"ContainerStarted","Data":"8d6b39cd6b039eb05d9bbac46938448f581611d70cd55520093739ba70b6e48e"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.615515 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw" 
event={"ID":"722bda9f-5a8b-4c83-8b1f-790da0003ce9","Type":"ContainerStarted","Data":"b6977f46f27851e30a28109002704ebc9f01c462d2861c8d4d416ea751cbe172"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.618407 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5" event={"ID":"6bedd3e9-4212-4b9b-866a-a473d7f1c632","Type":"ContainerStarted","Data":"cf7cbfb99fbaf34715fba99e46da3e1b5975f03ca13e661dec291e428d188638"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.618477 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5" event={"ID":"6bedd3e9-4212-4b9b-866a-a473d7f1c632","Type":"ContainerStarted","Data":"d3248cf068b81a405fb112e6bfaa70139056a8dc42811ffae71a01a06f787453"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.619801 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" event={"ID":"ccb6c76e-4ee2-4dcc-91a9-c91e25299780","Type":"ContainerStarted","Data":"ebe12fc2940989d5d15879bf3386fef5893ce44c58cd002dd7b84b485874cdcc"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.619928 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.621124 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" event={"ID":"e903551f-3d78-4de4-a08a-ce9ea234942c","Type":"ContainerStarted","Data":"806517c86b6a5019159fa17c452bf029de4b7dcb535dec694d01f62269e0cdac"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.624389 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" 
event={"ID":"5bcf9211-edc2-4706-a9ac-b5f38b856186","Type":"ContainerStarted","Data":"727c4d0934941ab59591696077c9c67e22746af476c0646616af2b3146abd3bb"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.625478 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" event={"ID":"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd","Type":"ContainerStarted","Data":"f9b9887e548d54bb14e6cb1ffbf2477aee082f38d18a932c8121b4adcb811fca"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.626002 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8btfx" podStartSLOduration=126.625986568 podStartE2EDuration="2m6.625986568s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.624397962 +0000 UTC m=+147.715799412" watchObservedRunningTime="2026-02-02 10:34:26.625986568 +0000 UTC m=+147.717388018" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.626291 4845 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z9qjh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.626322 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" podUID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.626799 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qmvcn" event={"ID":"b5d758a8-6722-4c1b-be56-fe2bb6d27830","Type":"ContainerStarted","Data":"1e8a96abdb190bfa53c7b46f82b607b3ccfbde111881cbba498bc5fa6b956156"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.626981 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" podStartSLOduration=126.626974336 podStartE2EDuration="2m6.626974336s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.737052843 +0000 UTC m=+146.828454283" watchObservedRunningTime="2026-02-02 10:34:26.626974336 +0000 UTC m=+147.718375786" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.629120 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" event={"ID":"be245fb2-4ef3-4642-aae0-14954ab28ffa","Type":"ContainerStarted","Data":"5f5dc25f1124301e18535afe7eb535af68b37b6a2b8da08b80ae3135e606d9c2"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.629148 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" event={"ID":"be245fb2-4ef3-4642-aae0-14954ab28ffa","Type":"ContainerStarted","Data":"59ea93e07ef4be9fc4d6c2fe422b2647eb90fd79fcdd6ada749bcdfcc72b5549"} Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.652550 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.662512 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" podStartSLOduration=125.662494666 podStartE2EDuration="2m5.662494666s" 
podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.661619921 +0000 UTC m=+147.753021371" watchObservedRunningTime="2026-02-02 10:34:26.662494666 +0000 UTC m=+147.753896116" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.681946 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f2fvl" podStartSLOduration=6.681930689 podStartE2EDuration="6.681930689s" podCreationTimestamp="2026-02-02 10:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.680743395 +0000 UTC m=+147.772144835" watchObservedRunningTime="2026-02-02 10:34:26.681930689 +0000 UTC m=+147.773332139" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.683377 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.683509 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.183490925 +0000 UTC m=+148.274892375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.684240 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.686142 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.186130721 +0000 UTC m=+148.277532171 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.713692 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qx7sb" podStartSLOduration=125.713674549 podStartE2EDuration="2m5.713674549s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.713559886 +0000 UTC m=+147.804961336" watchObservedRunningTime="2026-02-02 10:34:26.713674549 +0000 UTC m=+147.805075999" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.735431 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-2vcpg" podStartSLOduration=126.73541614 podStartE2EDuration="2m6.73541614s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.733677109 +0000 UTC m=+147.825078549" watchObservedRunningTime="2026-02-02 10:34:26.73541614 +0000 UTC m=+147.826817590" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.755992 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-c46tw" podStartSLOduration=125.755975226 podStartE2EDuration="2m5.755975226s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.755157222 +0000 UTC m=+147.846558682" watchObservedRunningTime="2026-02-02 10:34:26.755975226 +0000 UTC m=+147.847376676" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.788161 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.792204 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.292177075 +0000 UTC m=+148.383578535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.808343 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" podStartSLOduration=125.808324813 podStartE2EDuration="2m5.808324813s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.773556545 +0000 UTC m=+147.864957995" watchObservedRunningTime="2026-02-02 10:34:26.808324813 +0000 UTC m=+147.899726273" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.808961 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" podStartSLOduration=125.808954471 podStartE2EDuration="2m5.808954471s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.807601432 +0000 UTC m=+147.899002902" watchObservedRunningTime="2026-02-02 10:34:26.808954471 +0000 UTC m=+147.900355921" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.852162 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-h92cr" podStartSLOduration=126.852144073 podStartE2EDuration="2m6.852144073s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.848321672 +0000 UTC m=+147.939723122" watchObservedRunningTime="2026-02-02 10:34:26.852144073 +0000 UTC m=+147.943545523" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.887188 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" podStartSLOduration=125.887165698 podStartE2EDuration="2m5.887165698s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.886915131 +0000 UTC m=+147.978316581" watchObservedRunningTime="2026-02-02 10:34:26.887165698 +0000 UTC m=+147.978567148" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.891071 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.891356 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.391344379 +0000 UTC m=+148.482745829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.913231 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mgqnl" podStartSLOduration=6.913213953 podStartE2EDuration="6.913213953s" podCreationTimestamp="2026-02-02 10:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.911563635 +0000 UTC m=+148.002965085" watchObservedRunningTime="2026-02-02 10:34:26.913213953 +0000 UTC m=+148.004615403" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.975870 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-r92c5" podStartSLOduration=125.975853029 podStartE2EDuration="2m5.975853029s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.974115378 +0000 UTC m=+148.065516828" watchObservedRunningTime="2026-02-02 10:34:26.975853029 +0000 UTC m=+148.067254479" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.978333 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" podStartSLOduration=126.97832182 podStartE2EDuration="2m6.97832182s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.946445376 +0000 UTC m=+148.037846826" watchObservedRunningTime="2026-02-02 10:34:26.97832182 +0000 UTC m=+148.069723270" Feb 02 10:34:26 crc kubenswrapper[4845]: I0202 10:34:26.992950 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:26 crc kubenswrapper[4845]: E0202 10:34:26.993492 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.49347405 +0000 UTC m=+148.584875510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.027228 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ns95h" podStartSLOduration=126.027195127 podStartE2EDuration="2m6.027195127s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:27.024519509 +0000 UTC m=+148.115920959" watchObservedRunningTime="2026-02-02 10:34:27.027195127 +0000 UTC m=+148.118596577" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.057080 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rvxdk" podStartSLOduration=126.057057373 podStartE2EDuration="2m6.057057373s" podCreationTimestamp="2026-02-02 10:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:27.055132507 +0000 UTC m=+148.146533947" watchObservedRunningTime="2026-02-02 10:34:27.057057373 +0000 UTC m=+148.148458823" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.082592 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.082936 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.083624 4845 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xsdsh container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.29:8443/livez\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.083661 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" podUID="66ae9a2f-1c24-4a65-b961-bd9431c667f6" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.29:8443/livez\": dial tcp 10.217.0.29:8443: connect: connection refused" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.096532 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.096826 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.596816225 +0000 UTC m=+148.688217675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.190740 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-srnzq"] Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.191627 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.195283 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.200732 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.200994 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.700971934 +0000 UTC m=+148.792373394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.201141 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.201503 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.701494349 +0000 UTC m=+148.792895809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.229050 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srnzq"] Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.302650 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.302848 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.802822516 +0000 UTC m=+148.894223966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.302922 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.302968 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-catalog-content\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.303004 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-utilities\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.303024 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55ffd\" (UniqueName: 
\"kubernetes.io/projected/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-kube-api-access-55ffd\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.303275 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.803264749 +0000 UTC m=+148.894666259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.350809 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:27 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:27 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:27 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.350866 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.395583 4845 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-nqqx9"] Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.396572 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.398617 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.404531 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.404766 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-catalog-content\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.404814 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-utilities\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.404832 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55ffd\" (UniqueName: \"kubernetes.io/projected/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-kube-api-access-55ffd\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " 
pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.405199 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:27.905183553 +0000 UTC m=+148.996585003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.405899 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-catalog-content\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.406087 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-utilities\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.420274 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nqqx9"] Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.433378 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-55ffd\" (UniqueName: \"kubernetes.io/projected/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-kube-api-access-55ffd\") pod \"community-operators-srnzq\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.506023 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-utilities\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.506647 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-catalog-content\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.506757 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.506897 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb5dl\" (UniqueName: \"kubernetes.io/projected/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-kube-api-access-mb5dl\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc 
kubenswrapper[4845]: E0202 10:34:27.507149 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.007111918 +0000 UTC m=+149.098513368 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.515027 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.552036 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.552076 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.588693 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7pz84"] Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.589591 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.607931 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.608130 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.108083694 +0000 UTC m=+149.199485144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.608344 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-catalog-content\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.608451 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.608578 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb5dl\" (UniqueName: \"kubernetes.io/projected/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-kube-api-access-mb5dl\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.608708 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-utilities\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.608797 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.108780734 +0000 UTC m=+149.200182184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.608835 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-catalog-content\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.609245 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-utilities\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.636873 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb5dl\" (UniqueName: \"kubernetes.io/projected/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-kube-api-access-mb5dl\") pod \"certified-operators-nqqx9\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.652346 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pz84"] Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.665259 4845 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hr44j container/marketplace-operator 
namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.665308 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" podUID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.665527 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsz25" event={"ID":"54f66031-6300-4334-8a24-bfe02897b467","Type":"ContainerStarted","Data":"6b93665d738b5c7b6c7fbc76893ea096a3c10779063bcfaceb066fab2bc7a0c2"} Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.672216 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.727255 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.727553 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 
10:34:27.727599 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.727637 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-utilities\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.727667 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvpsx\" (UniqueName: \"kubernetes.io/projected/149fcd2d-91c2-493a-a1ec-c8675e1901ef-kube-api-access-nvpsx\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.727712 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-catalog-content\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.727875 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.727990 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.737266 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.737365 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.237341771 +0000 UTC m=+149.328743221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.740948 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.749321 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.752160 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.772214 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.787661 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-566gk"] Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.801768 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.829763 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.830122 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-utilities\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.830199 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvpsx\" (UniqueName: \"kubernetes.io/projected/149fcd2d-91c2-493a-a1ec-c8675e1901ef-kube-api-access-nvpsx\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.830222 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-catalog-content\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.832991 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.332973253 +0000 UTC m=+149.424374823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.835690 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-utilities\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.838023 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-catalog-content\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.843475 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-566gk"] Feb 02 10:34:27 crc 
kubenswrapper[4845]: I0202 10:34:27.884815 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvpsx\" (UniqueName: \"kubernetes.io/projected/149fcd2d-91c2-493a-a1ec-c8675e1901ef-kube-api-access-nvpsx\") pod \"community-operators-7pz84\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") " pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.903238 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.932833 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.933080 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s42p\" (UniqueName: \"kubernetes.io/projected/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-kube-api-access-5s42p\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.933136 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-utilities\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.933206 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-catalog-content\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:27 crc kubenswrapper[4845]: E0202 10:34:27.933285 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.43327307 +0000 UTC m=+149.524674520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.942371 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.957997 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:34:27 crc kubenswrapper[4845]: I0202 10:34:27.975134 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.034583 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.034621 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-utilities\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.034673 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-catalog-content\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.034707 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s42p\" (UniqueName: \"kubernetes.io/projected/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-kube-api-access-5s42p\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.035203 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-02 10:34:28.535192114 +0000 UTC m=+149.626593564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.035637 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-utilities\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.035831 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-catalog-content\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.071624 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s42p\" (UniqueName: \"kubernetes.io/projected/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-kube-api-access-5s42p\") pod \"certified-operators-566gk\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") " pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.136352 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.136536 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.636521221 +0000 UTC m=+149.727922671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.136733 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.137019 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.637011995 +0000 UTC m=+149.728413445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.166699 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-srnzq"] Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.174574 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.237314 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.238007 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.737988762 +0000 UTC m=+149.829390222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.338926 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.339251 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.839236547 +0000 UTC m=+149.930637997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.440367 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.440544 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.940518522 +0000 UTC m=+150.031919972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.440632 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.440874 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:28.940866983 +0000 UTC m=+150.032268433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.541505 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.541675 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.041647144 +0000 UTC m=+150.133048594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.541735 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.542040 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.042028535 +0000 UTC m=+150.133429985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.642641 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.642813 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.142789885 +0000 UTC m=+150.234191335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.642870 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.643131 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.143124135 +0000 UTC m=+150.234525585 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.666755 4845 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-c5c85 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.666797 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" podUID="f295e287-05b6-45e1-bfd5-3c71d7a87f15" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.22:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.668584 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srnzq" event={"ID":"b3624e54-1097-4ab1-bfff-d7e0f721f8f0","Type":"ContainerStarted","Data":"7489421b9d82561c3c59320133d54302d24308214065fb740dafa4f42a2056e8"} Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.744090 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.744340 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.244311578 +0000 UTC m=+150.335713018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:28 crc kubenswrapper[4845]: I0202 10:34:28.744810 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:28 crc kubenswrapper[4845]: E0202 10:34:28.745114 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.245106791 +0000 UTC m=+150.336508241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.133474 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 10:34:29.133806 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.633788197 +0000 UTC m=+150.725189647 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.194279 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:29 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:29 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:29 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.194872 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.199058 4845 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-x9pr7 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.199166 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" podUID="6de6b4aa-d335-4eb0-b880-7a21c9336ebf" containerName="openshift-config-operator" 
probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.200799 4845 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-x9pr7 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.200844 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" podUID="6de6b4aa-d335-4eb0-b880-7a21c9336ebf" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.237255 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 10:34:29.237611 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.737596696 +0000 UTC m=+150.828998146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.339859 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.340063 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 10:34:29.340168 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:29.840146918 +0000 UTC m=+150.931548369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.350442 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:29 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:29 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:29 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.350519 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.443986 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 10:34:29.444981 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-02 10:34:29.944963176 +0000 UTC m=+151.036364616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.492296 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nqqx9"] Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.504962 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.505805 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.518125 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.548311 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 10:34:29.549063 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 10:34:30.049041462 +0000 UTC m=+151.140442912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.564008 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.573585 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.668580 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.668646 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69c8101f-8598-4724-b4b8-404da68760f9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"69c8101f-8598-4724-b4b8-404da68760f9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.668670 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/69c8101f-8598-4724-b4b8-404da68760f9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"69c8101f-8598-4724-b4b8-404da68760f9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 10:34:29.668967 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:30.168954418 +0000 UTC m=+151.260355868 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.773192 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 10:34:29.773775 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:30.273751976 +0000 UTC m=+151.365153426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.774080 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69c8101f-8598-4724-b4b8-404da68760f9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"69c8101f-8598-4724-b4b8-404da68760f9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.774137 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69c8101f-8598-4724-b4b8-404da68760f9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"69c8101f-8598-4724-b4b8-404da68760f9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.774238 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 10:34:29.774697 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-02 10:34:30.274678562 +0000 UTC m=+151.366080012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.773586 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqqx9" event={"ID":"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06","Type":"ContainerStarted","Data":"411d114dec4634d8215b7fca2758294946426abdcb66d052869b2ccdb984e078"} Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.775039 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69c8101f-8598-4724-b4b8-404da68760f9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"69c8101f-8598-4724-b4b8-404da68760f9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.797561 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7pz84"] Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.806133 4845 generic.go:334] "Generic (PLEG): container finished" podID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerID="bce7fc681aa538cb76a68f4632435369a39f67dafd5cfa64c4b02c6ca032214c" exitCode=0 Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.806408 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srnzq" 
event={"ID":"b3624e54-1097-4ab1-bfff-d7e0f721f8f0","Type":"ContainerDied","Data":"bce7fc681aa538cb76a68f4632435369a39f67dafd5cfa64c4b02c6ca032214c"} Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.815361 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.843311 4845 generic.go:334] "Generic (PLEG): container finished" podID="30bde55e-4121-4b71-b6f4-6cb3a9acd82e" containerID="523a5bcb42e2050e61412c7d355eedcd8752caa8ebcb7f962918e3ce965038aa" exitCode=0 Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.843406 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" event={"ID":"30bde55e-4121-4b71-b6f4-6cb3a9acd82e","Type":"ContainerDied","Data":"523a5bcb42e2050e61412c7d355eedcd8752caa8ebcb7f962918e3ce965038aa"} Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.877060 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69c8101f-8598-4724-b4b8-404da68760f9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"69c8101f-8598-4724-b4b8-404da68760f9\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.877837 4845 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.878239 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:29 crc kubenswrapper[4845]: E0202 
10:34:29.878546 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:30.378532193 +0000 UTC m=+151.469933643 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.885631 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.922403 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsz25" event={"ID":"54f66031-6300-4334-8a24-bfe02897b467","Type":"ContainerStarted","Data":"822e913b7ae67619853c6c7a2090b6a15de579c20247e3b1e5e88f0c41b9c6b4"} Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.940683 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-49vf9" Feb 02 10:34:29 crc kubenswrapper[4845]: I0202 10:34:29.989113 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:29 crc 
kubenswrapper[4845]: E0202 10:34:29.989463 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:30.489451338 +0000 UTC m=+151.580852788 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.038099 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xf5wp"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.085780 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf5wp"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.085966 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.091233 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.092260 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 02 10:34:30 crc kubenswrapper[4845]: E0202 10:34:30.092427 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:30.592412822 +0000 UTC m=+151.683814272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.132557 4845 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-02T10:34:29.877852223Z","Handler":null,"Name":""} Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.200528 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-catalog-content\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.200573 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.200623 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8ql7\" (UniqueName: \"kubernetes.io/projected/3ceca4a8-b0dd-47cc-a1fe-818e984af772-kube-api-access-c8ql7\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " 
pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.200659 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-utilities\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: E0202 10:34:30.211954 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:34:30.700917267 +0000 UTC m=+151.792318717 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-thf72" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.221585 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nb8f9"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.222595 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.228464 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nb8f9"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.234096 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.236903 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-566gk"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.301404 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.301644 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-catalog-content\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.301736 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8ql7\" (UniqueName: \"kubernetes.io/projected/3ceca4a8-b0dd-47cc-a1fe-818e984af772-kube-api-access-c8ql7\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.301770 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-utilities\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.302303 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-utilities\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: E0202 10:34:30.302388 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:34:30.802373738 +0000 UTC m=+151.893775188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.303171 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-catalog-content\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.339194 4845 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: 
kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.339518 4845 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.350507 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8ql7\" (UniqueName: \"kubernetes.io/projected/3ceca4a8-b0dd-47cc-a1fe-818e984af772-kube-api-access-c8ql7\") pod \"redhat-marketplace-xf5wp\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.373131 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:30 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:30 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:30 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.373195 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.403769 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-catalog-content\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc 
kubenswrapper[4845]: I0202 10:34:30.403829 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg7pg\" (UniqueName: \"kubernetes.io/projected/66911d31-17db-4d9e-b0c2-9cb699fc0778-kube-api-access-fg7pg\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.403861 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.403924 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-utilities\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.409984 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-89bwt"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.411005 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.413698 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-89bwt"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.427741 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.427781 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.481298 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.504680 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-catalog-content\") pod \"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.504711 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7f2m\" (UniqueName: \"kubernetes.io/projected/346c427b-6ed6-4bac-ae1f-ee2400ab6884-kube-api-access-p7f2m\") pod \"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.504743 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-catalog-content\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " 
pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.504783 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg7pg\" (UniqueName: \"kubernetes.io/projected/66911d31-17db-4d9e-b0c2-9cb699fc0778-kube-api-access-fg7pg\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.504828 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-utilities\") pod \"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.504842 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-utilities\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.505201 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-utilities\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.505413 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-catalog-content\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 
10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.545209 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg7pg\" (UniqueName: \"kubernetes.io/projected/66911d31-17db-4d9e-b0c2-9cb699fc0778-kube-api-access-fg7pg\") pod \"redhat-operators-nb8f9\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.570425 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.578788 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jxk8q"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.579766 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.606236 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7f2m\" (UniqueName: \"kubernetes.io/projected/346c427b-6ed6-4bac-ae1f-ee2400ab6884-kube-api-access-p7f2m\") pod \"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.606478 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-utilities\") pod \"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.606609 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-catalog-content\") pod 
\"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.607472 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-catalog-content\") pod \"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.608078 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-utilities\") pod \"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.617653 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jxk8q"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.638630 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.682340 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7f2m\" (UniqueName: \"kubernetes.io/projected/346c427b-6ed6-4bac-ae1f-ee2400ab6884-kube-api-access-p7f2m\") pod \"redhat-marketplace-89bwt\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.713941 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-catalog-content\") pod \"redhat-operators-jxk8q\" (UID: 
\"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.714255 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-928mf\" (UniqueName: \"kubernetes.io/projected/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-kube-api-access-928mf\") pod \"redhat-operators-jxk8q\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.714287 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-utilities\") pod \"redhat-operators-jxk8q\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.737228 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-thf72\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.804056 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.816049 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.816577 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-catalog-content\") pod \"redhat-operators-jxk8q\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.816647 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-928mf\" (UniqueName: \"kubernetes.io/projected/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-kube-api-access-928mf\") pod \"redhat-operators-jxk8q\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.816672 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-utilities\") pod \"redhat-operators-jxk8q\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.817582 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-catalog-content\") pod \"redhat-operators-jxk8q\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " 
pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.827193 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-utilities\") pod \"redhat-operators-jxk8q\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.841487 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.845385 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-928mf\" (UniqueName: \"kubernetes.io/projected/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-kube-api-access-928mf\") pod \"redhat-operators-jxk8q\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: E0202 10:34:30.879127 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod149fcd2d_91c2_493a_a1ec_c8675e1901ef.slice/crio-conmon-4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.889837 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf5wp"] Feb 02 10:34:30 crc kubenswrapper[4845]: W0202 10:34:30.908284 4845 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ceca4a8_b0dd_47cc_a1fe_818e984af772.slice/crio-0d84e03378117188ed8428e61cf660af9bf78e4894f24fa58ab65c169ebc8078 WatchSource:0}: Error finding container 0d84e03378117188ed8428e61cf660af9bf78e4894f24fa58ab65c169ebc8078: Status 404 returned error can't find the container with id 0d84e03378117188ed8428e61cf660af9bf78e4894f24fa58ab65c169ebc8078 Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.931303 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.956357 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9bfe19dbbe7d1e40546e552ab43d37862197f4f4cfb7d08bb020bb00479fc6ba"} Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.956407 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d48d1c6c37d0a5e8ad259c05304325a7b427c719454c390b49bf80e8494319ac"} Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.965464 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wsz25" event={"ID":"54f66031-6300-4334-8a24-bfe02897b467","Type":"ContainerStarted","Data":"7714a652cda8199b9ff2d1687cc63840be0d7d6f127a29a22cba6c4a5e814604"} Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.968130 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"69c8101f-8598-4724-b4b8-404da68760f9","Type":"ContainerStarted","Data":"056e30bf0adad14504d00252db500715c4d9cd3fe159929432752cd0a022f096"} Feb 02 
10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.972728 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1dc524670dd7946b51dffe84822fe5d9ef966942bccaf3e1442ebb4e88426872"} Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.972776 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8f43a75f7821fc1df1d7f00deeace7d0a569fb02a8aaa97af12caa17613db502"} Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.988691 4845 generic.go:334] "Generic (PLEG): container finished" podID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerID="79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822" exitCode=0 Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.988777 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-566gk" event={"ID":"128f32ab-e2ce-4468-a7e8-bc84aa2bb275","Type":"ContainerDied","Data":"79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822"} Feb 02 10:34:30 crc kubenswrapper[4845]: I0202 10:34:30.988806 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-566gk" event={"ID":"128f32ab-e2ce-4468-a7e8-bc84aa2bb275","Type":"ContainerStarted","Data":"9bcdab2008f0c99412111dda17b5eb250f02325e2f7935a2aadaa0f85ebf0c92"} Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.003146 4845 generic.go:334] "Generic (PLEG): container finished" podID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerID="9aa6e5661ba4da1e3f07e47952d04aef1b22875db8155df2a2c40d59b58e9f5c" exitCode=0 Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.003424 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-nqqx9" event={"ID":"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06","Type":"ContainerDied","Data":"9aa6e5661ba4da1e3f07e47952d04aef1b22875db8155df2a2c40d59b58e9f5c"} Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.010706 4845 generic.go:334] "Generic (PLEG): container finished" podID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerID="4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358" exitCode=0 Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.010773 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pz84" event={"ID":"149fcd2d-91c2-493a-a1ec-c8675e1901ef","Type":"ContainerDied","Data":"4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358"} Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.010797 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pz84" event={"ID":"149fcd2d-91c2-493a-a1ec-c8675e1901ef","Type":"ContainerStarted","Data":"9c90348dba10073c8c8427f15a5626e4e8ea2c366d1f13b9219eb720f44735c2"} Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.023941 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.049687 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e2a63c510b246dde839c36e8a8c8d5dbdbe226d8530c8f27fb153a04a8c8aef5"} Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.049723 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"acf6c6962b036d9bc4c7a4d78a1dc9f7b42ce529abfeb5c629546e391eba24c4"} Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.050270 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.176354 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wsz25" podStartSLOduration=11.17633466 podStartE2EDuration="11.17633466s" podCreationTimestamp="2026-02-02 10:34:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:31.068456903 +0000 UTC m=+152.159858353" watchObservedRunningTime="2026-02-02 10:34:31.17633466 +0000 UTC m=+152.267736110" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.182169 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.183724 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.193077 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.193683 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.206454 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.211226 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-x9pr7" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.239702 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nb8f9"] Feb 02 10:34:31 crc kubenswrapper[4845]: W0202 10:34:31.258254 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66911d31_17db_4d9e_b0c2_9cb699fc0778.slice/crio-16c241a98cc49754e7cd69effe7e44d81157009a11eabff0c36f137b39003b4c WatchSource:0}: Error finding container 16c241a98cc49754e7cd69effe7e44d81157009a11eabff0c36f137b39003b4c: Status 404 returned error can't find the container with id 16c241a98cc49754e7cd69effe7e44d81157009a11eabff0c36f137b39003b4c Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.329508 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2246afc-db13-479f-8ce0-fbfd40b28302-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d2246afc-db13-479f-8ce0-fbfd40b28302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 
10:34:31.330737 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2246afc-db13-479f-8ce0-fbfd40b28302-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d2246afc-db13-479f-8ce0-fbfd40b28302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.369149 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:31 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:31 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:31 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.369202 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.432729 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2246afc-db13-479f-8ce0-fbfd40b28302-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d2246afc-db13-479f-8ce0-fbfd40b28302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.432859 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2246afc-db13-479f-8ce0-fbfd40b28302-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d2246afc-db13-479f-8ce0-fbfd40b28302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:31 crc 
kubenswrapper[4845]: I0202 10:34:31.432961 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2246afc-db13-479f-8ce0-fbfd40b28302-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d2246afc-db13-479f-8ce0-fbfd40b28302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.464068 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2246afc-db13-479f-8ce0-fbfd40b28302-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d2246afc-db13-479f-8ce0-fbfd40b28302\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.469225 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-89bwt"] Feb 02 10:34:31 crc kubenswrapper[4845]: W0202 10:34:31.481775 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod346c427b_6ed6_4bac_ae1f_ee2400ab6884.slice/crio-6eeeecc6f7635724fe5c5de24d8b0a5978478aedd321e7117013fb83d5e17bed WatchSource:0}: Error finding container 6eeeecc6f7635724fe5c5de24d8b0a5978478aedd321e7117013fb83d5e17bed: Status 404 returned error can't find the container with id 6eeeecc6f7635724fe5c5de24d8b0a5978478aedd321e7117013fb83d5e17bed Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.523118 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.540490 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-thf72"] Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.637732 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:31 crc kubenswrapper[4845]: W0202 10:34:31.680831 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1579ee3d_0fc7_456f_a78a_eb18aa7bf2bd.slice/crio-45e45e9dbbedd64142e2fefb82cba902c15048017b203f1719c5975bd5a5ec46 WatchSource:0}: Error finding container 45e45e9dbbedd64142e2fefb82cba902c15048017b203f1719c5975bd5a5ec46: Status 404 returned error can't find the container with id 45e45e9dbbedd64142e2fefb82cba902c15048017b203f1719c5975bd5a5ec46 Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.686706 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jxk8q"] Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.725329 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.737395 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-secret-volume\") pod \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.737428 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6jd9\" (UniqueName: \"kubernetes.io/projected/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-kube-api-access-j6jd9\") pod \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.737523 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-config-volume\") pod \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\" (UID: \"30bde55e-4121-4b71-b6f4-6cb3a9acd82e\") " Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.738212 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-config-volume" (OuterVolumeSpecName: "config-volume") pod "30bde55e-4121-4b71-b6f4-6cb3a9acd82e" (UID: "30bde55e-4121-4b71-b6f4-6cb3a9acd82e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.747415 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-kube-api-access-j6jd9" (OuterVolumeSpecName: "kube-api-access-j6jd9") pod "30bde55e-4121-4b71-b6f4-6cb3a9acd82e" (UID: "30bde55e-4121-4b71-b6f4-6cb3a9acd82e"). InnerVolumeSpecName "kube-api-access-j6jd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.755685 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "30bde55e-4121-4b71-b6f4-6cb3a9acd82e" (UID: "30bde55e-4121-4b71-b6f4-6cb3a9acd82e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.840541 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.840565 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:31 crc kubenswrapper[4845]: I0202 10:34:31.840576 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6jd9\" (UniqueName: \"kubernetes.io/projected/30bde55e-4121-4b71-b6f4-6cb3a9acd82e-kube-api-access-j6jd9\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.040033 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:34:32 crc kubenswrapper[4845]: W0202 10:34:32.052197 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd2246afc_db13_479f_8ce0_fbfd40b28302.slice/crio-0870c32ce60ba45e38648a70459251980f5f1985ecf0756c2c03ed4ce2b7941b WatchSource:0}: Error finding container 0870c32ce60ba45e38648a70459251980f5f1985ecf0756c2c03ed4ce2b7941b: Status 404 returned error can't find the container with id 0870c32ce60ba45e38648a70459251980f5f1985ecf0756c2c03ed4ce2b7941b Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.064559 4845 generic.go:334] "Generic (PLEG): container finished" podID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerID="df3414b89201cc98711df2db8d1c899c06cc968a324e0b8467c7e36e96868e51" exitCode=0 Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.064626 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf5wp" 
event={"ID":"3ceca4a8-b0dd-47cc-a1fe-818e984af772","Type":"ContainerDied","Data":"df3414b89201cc98711df2db8d1c899c06cc968a324e0b8467c7e36e96868e51"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.064649 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf5wp" event={"ID":"3ceca4a8-b0dd-47cc-a1fe-818e984af772","Type":"ContainerStarted","Data":"0d84e03378117188ed8428e61cf660af9bf78e4894f24fa58ab65c169ebc8078"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.069000 4845 generic.go:334] "Generic (PLEG): container finished" podID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerID="f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175" exitCode=0 Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.069080 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxk8q" event={"ID":"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd","Type":"ContainerDied","Data":"f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.069109 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxk8q" event={"ID":"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd","Type":"ContainerStarted","Data":"45e45e9dbbedd64142e2fefb82cba902c15048017b203f1719c5975bd5a5ec46"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.086078 4845 generic.go:334] "Generic (PLEG): container finished" podID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerID="ac8e195b0cbc3f127a71603eaf254c9928ef5832347572bb602ee62efcc4964c" exitCode=0 Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.086185 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89bwt" event={"ID":"346c427b-6ed6-4bac-ae1f-ee2400ab6884","Type":"ContainerDied","Data":"ac8e195b0cbc3f127a71603eaf254c9928ef5832347572bb602ee62efcc4964c"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 
10:34:32.086250 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89bwt" event={"ID":"346c427b-6ed6-4bac-ae1f-ee2400ab6884","Type":"ContainerStarted","Data":"6eeeecc6f7635724fe5c5de24d8b0a5978478aedd321e7117013fb83d5e17bed"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.094483 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" event={"ID":"30bde55e-4121-4b71-b6f4-6cb3a9acd82e","Type":"ContainerDied","Data":"784d840cf0db3fd4333a0324bf44adc6d663afbb97888004d15f7427f54cbe89"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.094527 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="784d840cf0db3fd4333a0324bf44adc6d663afbb97888004d15f7427f54cbe89" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.094591 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.115084 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.118334 4845 generic.go:334] "Generic (PLEG): container finished" podID="69c8101f-8598-4724-b4b8-404da68760f9" containerID="160bf3d5bf0112854d2dffd0786d9579ef214a1860260acb1922ad96e7bc5212" exitCode=0 Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.118390 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"69c8101f-8598-4724-b4b8-404da68760f9","Type":"ContainerDied","Data":"160bf3d5bf0112854d2dffd0786d9579ef214a1860260acb1922ad96e7bc5212"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.125032 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-apiserver/apiserver-76f77b778f-xsdsh" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.130459 4845 generic.go:334] "Generic (PLEG): container finished" podID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerID="ce02065ee7ea69eff70e052c4a44aab42be4dcba45489f2ca7fc1dcf482f400b" exitCode=0 Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.130523 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8f9" event={"ID":"66911d31-17db-4d9e-b0c2-9cb699fc0778","Type":"ContainerDied","Data":"ce02065ee7ea69eff70e052c4a44aab42be4dcba45489f2ca7fc1dcf482f400b"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.130584 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8f9" event={"ID":"66911d31-17db-4d9e-b0c2-9cb699fc0778","Type":"ContainerStarted","Data":"16c241a98cc49754e7cd69effe7e44d81157009a11eabff0c36f137b39003b4c"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.175210 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-4rbqr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.175457 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4rbqr" podUID="a70e2a3d-9afe-4437-b9ef-fe175eee93d6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.177303 4845 patch_prober.go:28] interesting pod/downloads-7954f5f757-4rbqr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Feb 02 10:34:32 crc 
kubenswrapper[4845]: I0202 10:34:32.177362 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4rbqr" podUID="a70e2a3d-9afe-4437-b9ef-fe175eee93d6" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.180912 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" event={"ID":"339fe372-b3de-4832-b32f-0218d2c0545b","Type":"ContainerStarted","Data":"c850483e123504ea082a0c8f17db4867ee8a686d50fc85724804f4fd70d8bc85"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.180958 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" event={"ID":"339fe372-b3de-4832-b32f-0218d2c0545b","Type":"ContainerStarted","Data":"b02dab70c0d575fa03917d15a317aa67ee29edaf5ecdee7aa9680da7630022b9"} Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.185571 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.357341 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.367773 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:32 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:32 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:32 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.367839 4845 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.590252 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.590336 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.595466 4845 patch_prober.go:28] interesting pod/console-f9d7485db-8gjpm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Feb 02 10:34:32 crc kubenswrapper[4845]: I0202 10:34:32.595527 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8gjpm" podUID="04d41e42-423a-4bac-bc05-3c424c978fd8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.081043 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.096444 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" podStartSLOduration=133.096426163 podStartE2EDuration="2m13.096426163s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:32.298082004 +0000 UTC m=+153.389483474" 
watchObservedRunningTime="2026-02-02 10:34:33.096426163 +0000 UTC m=+154.187827613" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.189577 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d2246afc-db13-479f-8ce0-fbfd40b28302","Type":"ContainerStarted","Data":"22f42242af9731519d48af37da66970e4887d85a44fa73dc5d26a8dc7828a600"} Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.189653 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d2246afc-db13-479f-8ce0-fbfd40b28302","Type":"ContainerStarted","Data":"0870c32ce60ba45e38648a70459251980f5f1985ecf0756c2c03ed4ce2b7941b"} Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.203033 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.203017923 podStartE2EDuration="2.203017923s" podCreationTimestamp="2026-02-02 10:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:33.201460478 +0000 UTC m=+154.292861928" watchObservedRunningTime="2026-02-02 10:34:33.203017923 +0000 UTC m=+154.294419373" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.363136 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:33 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:33 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:33 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.363202 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" 
podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.393984 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c5c85" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.617006 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.682233 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69c8101f-8598-4724-b4b8-404da68760f9-kube-api-access\") pod \"69c8101f-8598-4724-b4b8-404da68760f9\" (UID: \"69c8101f-8598-4724-b4b8-404da68760f9\") " Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.682643 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69c8101f-8598-4724-b4b8-404da68760f9-kubelet-dir\") pod \"69c8101f-8598-4724-b4b8-404da68760f9\" (UID: \"69c8101f-8598-4724-b4b8-404da68760f9\") " Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.683026 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69c8101f-8598-4724-b4b8-404da68760f9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "69c8101f-8598-4724-b4b8-404da68760f9" (UID: "69c8101f-8598-4724-b4b8-404da68760f9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.690921 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c8101f-8598-4724-b4b8-404da68760f9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "69c8101f-8598-4724-b4b8-404da68760f9" (UID: "69c8101f-8598-4724-b4b8-404da68760f9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.784412 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/69c8101f-8598-4724-b4b8-404da68760f9-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:33 crc kubenswrapper[4845]: I0202 10:34:33.784441 4845 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/69c8101f-8598-4724-b4b8-404da68760f9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:34 crc kubenswrapper[4845]: I0202 10:34:34.202709 4845 generic.go:334] "Generic (PLEG): container finished" podID="d2246afc-db13-479f-8ce0-fbfd40b28302" containerID="22f42242af9731519d48af37da66970e4887d85a44fa73dc5d26a8dc7828a600" exitCode=0 Feb 02 10:34:34 crc kubenswrapper[4845]: I0202 10:34:34.202829 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d2246afc-db13-479f-8ce0-fbfd40b28302","Type":"ContainerDied","Data":"22f42242af9731519d48af37da66970e4887d85a44fa73dc5d26a8dc7828a600"} Feb 02 10:34:34 crc kubenswrapper[4845]: I0202 10:34:34.206766 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"69c8101f-8598-4724-b4b8-404da68760f9","Type":"ContainerDied","Data":"056e30bf0adad14504d00252db500715c4d9cd3fe159929432752cd0a022f096"} Feb 02 10:34:34 crc kubenswrapper[4845]: I0202 
10:34:34.206799 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="056e30bf0adad14504d00252db500715c4d9cd3fe159929432752cd0a022f096" Feb 02 10:34:34 crc kubenswrapper[4845]: I0202 10:34:34.206808 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:34:34 crc kubenswrapper[4845]: I0202 10:34:34.349950 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:34 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:34 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:34 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:34 crc kubenswrapper[4845]: I0202 10:34:34.350021 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.182639 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f2fvl" Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.349127 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:35 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:35 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:35 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.349208 4845 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.628353 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.731198 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2246afc-db13-479f-8ce0-fbfd40b28302-kubelet-dir\") pod \"d2246afc-db13-479f-8ce0-fbfd40b28302\" (UID: \"d2246afc-db13-479f-8ce0-fbfd40b28302\") " Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.731301 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2246afc-db13-479f-8ce0-fbfd40b28302-kube-api-access\") pod \"d2246afc-db13-479f-8ce0-fbfd40b28302\" (UID: \"d2246afc-db13-479f-8ce0-fbfd40b28302\") " Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.731988 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2246afc-db13-479f-8ce0-fbfd40b28302-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d2246afc-db13-479f-8ce0-fbfd40b28302" (UID: "d2246afc-db13-479f-8ce0-fbfd40b28302"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.738055 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2246afc-db13-479f-8ce0-fbfd40b28302-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d2246afc-db13-479f-8ce0-fbfd40b28302" (UID: "d2246afc-db13-479f-8ce0-fbfd40b28302"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.834161 4845 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2246afc-db13-479f-8ce0-fbfd40b28302-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:35 crc kubenswrapper[4845]: I0202 10:34:35.834210 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2246afc-db13-479f-8ce0-fbfd40b28302-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:36 crc kubenswrapper[4845]: I0202 10:34:36.227964 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d2246afc-db13-479f-8ce0-fbfd40b28302","Type":"ContainerDied","Data":"0870c32ce60ba45e38648a70459251980f5f1985ecf0756c2c03ed4ce2b7941b"} Feb 02 10:34:36 crc kubenswrapper[4845]: I0202 10:34:36.228006 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0870c32ce60ba45e38648a70459251980f5f1985ecf0756c2c03ed4ce2b7941b" Feb 02 10:34:36 crc kubenswrapper[4845]: I0202 10:34:36.228099 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:34:36 crc kubenswrapper[4845]: I0202 10:34:36.349380 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:36 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:36 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:36 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:36 crc kubenswrapper[4845]: I0202 10:34:36.349456 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:37 crc kubenswrapper[4845]: I0202 10:34:37.349669 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:37 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:37 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:37 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:37 crc kubenswrapper[4845]: I0202 10:34:37.350001 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:38 crc kubenswrapper[4845]: I0202 10:34:38.348397 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:38 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:38 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:38 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:38 crc kubenswrapper[4845]: I0202 10:34:38.348453 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:39 crc kubenswrapper[4845]: I0202 10:34:39.349742 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:39 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:39 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:39 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:39 crc kubenswrapper[4845]: I0202 10:34:39.350106 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:40 crc kubenswrapper[4845]: I0202 10:34:40.349273 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:40 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:40 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:40 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:40 crc kubenswrapper[4845]: I0202 10:34:40.349328 4845 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:41 crc kubenswrapper[4845]: I0202 10:34:41.349450 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:41 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:41 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:41 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:41 crc kubenswrapper[4845]: I0202 10:34:41.349517 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:42 crc kubenswrapper[4845]: I0202 10:34:42.180314 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4rbqr" Feb 02 10:34:42 crc kubenswrapper[4845]: I0202 10:34:42.349212 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:42 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:42 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:42 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:42 crc kubenswrapper[4845]: I0202 10:34:42.349270 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:42 crc kubenswrapper[4845]: I0202 10:34:42.589922 4845 patch_prober.go:28] interesting pod/console-f9d7485db-8gjpm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Feb 02 10:34:42 crc kubenswrapper[4845]: I0202 10:34:42.589990 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8gjpm" podUID="04d41e42-423a-4bac-bc05-3c424c978fd8" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" Feb 02 10:34:42 crc kubenswrapper[4845]: I0202 10:34:42.745799 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:34:42 crc kubenswrapper[4845]: I0202 10:34:42.756266 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84cb7b66-62e7-4012-ab80-7c5e6ba51e35-metrics-certs\") pod \"network-metrics-daemon-pmn9h\" (UID: \"84cb7b66-62e7-4012-ab80-7c5e6ba51e35\") " pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:34:42 crc kubenswrapper[4845]: I0202 10:34:42.983517 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pmn9h" Feb 02 10:34:43 crc kubenswrapper[4845]: I0202 10:34:43.358228 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:43 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:43 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:43 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:43 crc kubenswrapper[4845]: I0202 10:34:43.358527 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:44 crc kubenswrapper[4845]: I0202 10:34:44.354839 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:44 crc kubenswrapper[4845]: [-]has-synced failed: reason withheld Feb 02 10:34:44 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:44 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:44 crc kubenswrapper[4845]: I0202 10:34:44.354943 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:45 crc kubenswrapper[4845]: I0202 10:34:45.351142 4845 patch_prober.go:28] interesting pod/router-default-5444994796-rbhk2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Feb 02 10:34:45 crc kubenswrapper[4845]: [+]has-synced ok Feb 02 10:34:45 crc kubenswrapper[4845]: [+]process-running ok Feb 02 10:34:45 crc kubenswrapper[4845]: healthz check failed Feb 02 10:34:45 crc kubenswrapper[4845]: I0202 10:34:45.351238 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rbhk2" podUID="8291b32a-8322-4027-af13-cd9f10390406" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:34:46 crc kubenswrapper[4845]: I0202 10:34:46.238383 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:34:46 crc kubenswrapper[4845]: I0202 10:34:46.238468 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:34:46 crc kubenswrapper[4845]: I0202 10:34:46.349357 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:46 crc kubenswrapper[4845]: I0202 10:34:46.352053 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-rbhk2" Feb 02 10:34:47 crc kubenswrapper[4845]: I0202 10:34:47.025153 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9qjh"] Feb 02 10:34:47 crc kubenswrapper[4845]: I0202 10:34:47.025660 4845 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" podUID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" containerName="controller-manager" containerID="cri-o://f9b9887e548d54bb14e6cb1ffbf2477aee082f38d18a932c8121b4adcb811fca" gracePeriod=30 Feb 02 10:34:47 crc kubenswrapper[4845]: I0202 10:34:47.042278 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"] Feb 02 10:34:47 crc kubenswrapper[4845]: I0202 10:34:47.042715 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" podUID="ac875c91-285a-420b-9065-50af53ab50d3" containerName="route-controller-manager" containerID="cri-o://1fcb883e8f41725d44677baa16381c121b0232bd5110fcb8fbbbde75a1dff506" gracePeriod=30 Feb 02 10:34:48 crc kubenswrapper[4845]: I0202 10:34:48.326559 4845 generic.go:334] "Generic (PLEG): container finished" podID="ac875c91-285a-420b-9065-50af53ab50d3" containerID="1fcb883e8f41725d44677baa16381c121b0232bd5110fcb8fbbbde75a1dff506" exitCode=0 Feb 02 10:34:48 crc kubenswrapper[4845]: I0202 10:34:48.326646 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" event={"ID":"ac875c91-285a-420b-9065-50af53ab50d3","Type":"ContainerDied","Data":"1fcb883e8f41725d44677baa16381c121b0232bd5110fcb8fbbbde75a1dff506"} Feb 02 10:34:48 crc kubenswrapper[4845]: I0202 10:34:48.328767 4845 generic.go:334] "Generic (PLEG): container finished" podID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" containerID="f9b9887e548d54bb14e6cb1ffbf2477aee082f38d18a932c8121b4adcb811fca" exitCode=0 Feb 02 10:34:48 crc kubenswrapper[4845]: I0202 10:34:48.328829 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" 
event={"ID":"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd","Type":"ContainerDied","Data":"f9b9887e548d54bb14e6cb1ffbf2477aee082f38d18a932c8121b4adcb811fca"} Feb 02 10:34:51 crc kubenswrapper[4845]: I0202 10:34:51.028719 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:34:52 crc kubenswrapper[4845]: I0202 10:34:52.594307 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:52 crc kubenswrapper[4845]: I0202 10:34:52.597842 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:34:52 crc kubenswrapper[4845]: I0202 10:34:52.846754 4845 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-z9qjh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 02 10:34:52 crc kubenswrapper[4845]: I0202 10:34:52.846835 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" podUID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 02 10:34:53 crc kubenswrapper[4845]: I0202 10:34:53.064183 4845 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-jvc49 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:34:53 crc kubenswrapper[4845]: I0202 10:34:53.064240 4845 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" podUID="ac875c91-285a-420b-9065-50af53ab50d3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.511599 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.549498 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q"] Feb 02 10:34:54 crc kubenswrapper[4845]: E0202 10:34:54.549801 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c8101f-8598-4724-b4b8-404da68760f9" containerName="pruner" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.549821 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c8101f-8598-4724-b4b8-404da68760f9" containerName="pruner" Feb 02 10:34:54 crc kubenswrapper[4845]: E0202 10:34:54.549839 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2246afc-db13-479f-8ce0-fbfd40b28302" containerName="pruner" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.549850 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2246afc-db13-479f-8ce0-fbfd40b28302" containerName="pruner" Feb 02 10:34:54 crc kubenswrapper[4845]: E0202 10:34:54.549871 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30bde55e-4121-4b71-b6f4-6cb3a9acd82e" containerName="collect-profiles" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.549916 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="30bde55e-4121-4b71-b6f4-6cb3a9acd82e" containerName="collect-profiles" Feb 02 10:34:54 crc 
kubenswrapper[4845]: E0202 10:34:54.549936 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac875c91-285a-420b-9065-50af53ab50d3" containerName="route-controller-manager" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.549947 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac875c91-285a-420b-9065-50af53ab50d3" containerName="route-controller-manager" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.550297 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac875c91-285a-420b-9065-50af53ab50d3" containerName="route-controller-manager" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.550318 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c8101f-8598-4724-b4b8-404da68760f9" containerName="pruner" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.550335 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="30bde55e-4121-4b71-b6f4-6cb3a9acd82e" containerName="collect-profiles" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.550352 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2246afc-db13-479f-8ce0-fbfd40b28302" containerName="pruner" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.550960 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.555498 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q"] Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.620782 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-client-ca\") pod \"ac875c91-285a-420b-9065-50af53ab50d3\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.620850 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw74n\" (UniqueName: \"kubernetes.io/projected/ac875c91-285a-420b-9065-50af53ab50d3-kube-api-access-rw74n\") pod \"ac875c91-285a-420b-9065-50af53ab50d3\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.620871 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac875c91-285a-420b-9065-50af53ab50d3-serving-cert\") pod \"ac875c91-285a-420b-9065-50af53ab50d3\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.620968 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-config\") pod \"ac875c91-285a-420b-9065-50af53ab50d3\" (UID: \"ac875c91-285a-420b-9065-50af53ab50d3\") " Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.621581 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-client-ca" (OuterVolumeSpecName: "client-ca") pod "ac875c91-285a-420b-9065-50af53ab50d3" 
(UID: "ac875c91-285a-420b-9065-50af53ab50d3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.621617 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-config" (OuterVolumeSpecName: "config") pod "ac875c91-285a-420b-9065-50af53ab50d3" (UID: "ac875c91-285a-420b-9065-50af53ab50d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.625716 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac875c91-285a-420b-9065-50af53ab50d3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ac875c91-285a-420b-9065-50af53ab50d3" (UID: "ac875c91-285a-420b-9065-50af53ab50d3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.634438 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac875c91-285a-420b-9065-50af53ab50d3-kube-api-access-rw74n" (OuterVolumeSpecName: "kube-api-access-rw74n") pod "ac875c91-285a-420b-9065-50af53ab50d3" (UID: "ac875c91-285a-420b-9065-50af53ab50d3"). InnerVolumeSpecName "kube-api-access-rw74n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.729534 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-client-ca\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.729686 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxzgb\" (UniqueName: \"kubernetes.io/projected/fc9b9f1d-2d08-40fa-a147-e57ea489a514-kube-api-access-zxzgb\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.729833 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9b9f1d-2d08-40fa-a147-e57ea489a514-serving-cert\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.729926 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-config\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.730151 4845 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rw74n\" (UniqueName: \"kubernetes.io/projected/ac875c91-285a-420b-9065-50af53ab50d3-kube-api-access-rw74n\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.730174 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac875c91-285a-420b-9065-50af53ab50d3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.730186 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.730196 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac875c91-285a-420b-9065-50af53ab50d3-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.831360 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9b9f1d-2d08-40fa-a147-e57ea489a514-serving-cert\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.831421 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-config\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.831462 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-client-ca\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.831492 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxzgb\" (UniqueName: \"kubernetes.io/projected/fc9b9f1d-2d08-40fa-a147-e57ea489a514-kube-api-access-zxzgb\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.832844 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-config\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.834022 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-client-ca\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.837820 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9b9f1d-2d08-40fa-a147-e57ea489a514-serving-cert\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc 
kubenswrapper[4845]: I0202 10:34:54.848046 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxzgb\" (UniqueName: \"kubernetes.io/projected/fc9b9f1d-2d08-40fa-a147-e57ea489a514-kube-api-access-zxzgb\") pod \"route-controller-manager-5ff8755c47-x4g8q\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:54 crc kubenswrapper[4845]: I0202 10:34:54.869142 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.385495 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" event={"ID":"ac875c91-285a-420b-9065-50af53ab50d3","Type":"ContainerDied","Data":"dca4acc312ecd37056dbc4edd5440def5a4b22eb4ea478d220c28a8e7aa4f810"} Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.386253 4845 scope.go:117] "RemoveContainer" containerID="1fcb883e8f41725d44677baa16381c121b0232bd5110fcb8fbbbde75a1dff506" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.385741 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.415437 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"] Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.419641 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-jvc49"] Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.724009 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac875c91-285a-420b-9065-50af53ab50d3" path="/var/lib/kubelet/pods/ac875c91-285a-420b-9065-50af53ab50d3/volumes" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.752264 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.843870 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-serving-cert\") pod \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.844016 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-proxy-ca-bundles\") pod \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.844083 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xzkk\" (UniqueName: \"kubernetes.io/projected/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-kube-api-access-4xzkk\") pod \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\" 
(UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.844127 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-config\") pod \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.844173 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-client-ca\") pod \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\" (UID: \"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd\") " Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.844948 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" (UID: "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.844980 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-client-ca" (OuterVolumeSpecName: "client-ca") pod "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" (UID: "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.845103 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-config" (OuterVolumeSpecName: "config") pod "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" (UID: "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.849946 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" (UID: "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.850400 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-kube-api-access-4xzkk" (OuterVolumeSpecName: "kube-api-access-4xzkk") pod "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" (UID: "2fd9461a-3591-4e69-a9fd-2fd7de4d84cd"). InnerVolumeSpecName "kube-api-access-4xzkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.945665 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.945703 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xzkk\" (UniqueName: \"kubernetes.io/projected/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-kube-api-access-4xzkk\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.945717 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.945725 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-client-ca\") on node \"crc\" 
DevicePath \"\"" Feb 02 10:34:55 crc kubenswrapper[4845]: I0202 10:34:55.945733 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:56 crc kubenswrapper[4845]: I0202 10:34:56.391674 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" event={"ID":"2fd9461a-3591-4e69-a9fd-2fd7de4d84cd","Type":"ContainerDied","Data":"3b0e906a8843ecd4cf6e3849e6d3f269dfbf6989409b6ad8fb10a3cbc5ff1f7a"} Feb 02 10:34:56 crc kubenswrapper[4845]: I0202 10:34:56.391742 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-z9qjh" Feb 02 10:34:56 crc kubenswrapper[4845]: I0202 10:34:56.439208 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9qjh"] Feb 02 10:34:56 crc kubenswrapper[4845]: I0202 10:34:56.441973 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-z9qjh"] Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.168362 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9"] Feb 02 10:34:57 crc kubenswrapper[4845]: E0202 10:34:57.169042 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" containerName="controller-manager" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.169062 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" containerName="controller-manager" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.169569 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" 
containerName="controller-manager" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.170278 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.174846 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.175012 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.175116 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.175370 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.176067 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.177332 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.188467 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.196579 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9"] Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.279192 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-proxy-ca-bundles\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.279277 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-client-ca\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.279307 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smrkl\" (UniqueName: \"kubernetes.io/projected/d9d5f2d7-4523-4243-9439-0b6a0d320578-kube-api-access-smrkl\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.279333 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-config\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.279364 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d5f2d7-4523-4243-9439-0b6a0d320578-serving-cert\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 
10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.380603 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-proxy-ca-bundles\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.380699 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-client-ca\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.380728 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smrkl\" (UniqueName: \"kubernetes.io/projected/d9d5f2d7-4523-4243-9439-0b6a0d320578-kube-api-access-smrkl\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.380753 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-config\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.380777 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d5f2d7-4523-4243-9439-0b6a0d320578-serving-cert\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: 
\"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.382747 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-client-ca\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.383216 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-proxy-ca-bundles\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.384060 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-config\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.387790 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d5f2d7-4523-4243-9439-0b6a0d320578-serving-cert\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.396033 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smrkl\" (UniqueName: 
\"kubernetes.io/projected/d9d5f2d7-4523-4243-9439-0b6a0d320578-kube-api-access-smrkl\") pod \"controller-manager-7c58f68dfb-lc8q9\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: E0202 10:34:57.398524 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 02 10:34:57 crc kubenswrapper[4845]: E0202 10:34:57.398680 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mb5dl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},S
tdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-nqqx9_openshift-marketplace(fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 10:34:57 crc kubenswrapper[4845]: E0202 10:34:57.400437 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-nqqx9" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" Feb 02 10:34:57 crc kubenswrapper[4845]: E0202 10:34:57.461082 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 02 10:34:57 crc kubenswrapper[4845]: E0202 10:34:57.461222 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-55ffd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-srnzq_openshift-marketplace(b3624e54-1097-4ab1-bfff-d7e0f721f8f0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 10:34:57 crc kubenswrapper[4845]: E0202 10:34:57.463134 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-srnzq" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" Feb 02 10:34:57 crc 
kubenswrapper[4845]: I0202 10:34:57.490214 4845 scope.go:117] "RemoveContainer" containerID="f9b9887e548d54bb14e6cb1ffbf2477aee082f38d18a932c8121b4adcb811fca" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.499706 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.703493 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q"] Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.720287 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd9461a-3591-4e69-a9fd-2fd7de4d84cd" path="/var/lib/kubelet/pods/2fd9461a-3591-4e69-a9fd-2fd7de4d84cd/volumes" Feb 02 10:34:57 crc kubenswrapper[4845]: I0202 10:34:57.727564 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pmn9h"] Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.049371 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9"] Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.402305 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" event={"ID":"84cb7b66-62e7-4012-ab80-7c5e6ba51e35","Type":"ContainerStarted","Data":"b972b425867baab7544057f945353ac29e794d6f3f614a0845cb92b6a6f17282"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.402360 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" event={"ID":"84cb7b66-62e7-4012-ab80-7c5e6ba51e35","Type":"ContainerStarted","Data":"b590edabf427ea4389dbc136eecc0e6927f2ff445e18564798b3169338003976"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.406006 4845 generic.go:334] "Generic (PLEG): container finished" podID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" 
containerID="3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b" exitCode=0 Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.406082 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-566gk" event={"ID":"128f32ab-e2ce-4468-a7e8-bc84aa2bb275","Type":"ContainerDied","Data":"3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.409719 4845 generic.go:334] "Generic (PLEG): container finished" podID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerID="30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992" exitCode=0 Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.409800 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pz84" event={"ID":"149fcd2d-91c2-493a-a1ec-c8675e1901ef","Type":"ContainerDied","Data":"30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.416100 4845 generic.go:334] "Generic (PLEG): container finished" podID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerID="d5ee5a1faa591f8599ea0b16afa6af997338eec66ea0aded98078f3d0d23d736" exitCode=0 Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.416227 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89bwt" event={"ID":"346c427b-6ed6-4bac-ae1f-ee2400ab6884","Type":"ContainerDied","Data":"d5ee5a1faa591f8599ea0b16afa6af997338eec66ea0aded98078f3d0d23d736"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.427651 4845 generic.go:334] "Generic (PLEG): container finished" podID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerID="9de0fd1bbbe5a9cccba7aa175bf6868c3db78e2836e4127189cb452f9838b8c4" exitCode=0 Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.428009 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf5wp" 
event={"ID":"3ceca4a8-b0dd-47cc-a1fe-818e984af772","Type":"ContainerDied","Data":"9de0fd1bbbe5a9cccba7aa175bf6868c3db78e2836e4127189cb452f9838b8c4"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.430326 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxk8q" event={"ID":"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd","Type":"ContainerStarted","Data":"3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.437714 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" event={"ID":"fc9b9f1d-2d08-40fa-a147-e57ea489a514","Type":"ContainerStarted","Data":"32b5f5cc2814e9a9f951b6cb1e99fe49b56e048a66aaa5ae91e748c145111771"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.437784 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" event={"ID":"fc9b9f1d-2d08-40fa-a147-e57ea489a514","Type":"ContainerStarted","Data":"f4312bd794368ada2b214c2bbcf464e538e85d6eda329587cacdfdcfaf47a059"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.437879 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.439443 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8f9" event={"ID":"66911d31-17db-4d9e-b0c2-9cb699fc0778","Type":"ContainerStarted","Data":"892e8dde5cc61beb8bab363fa98a836495168d8b933253d6114935636c04fbe1"} Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.445291 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.448282 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" event={"ID":"d9d5f2d7-4523-4243-9439-0b6a0d320578","Type":"ContainerStarted","Data":"1e75a66a1944a3a8fc705151c58070ce86c62b0f80c33f6ac98c68e911d70d1a"} Feb 02 10:34:58 crc kubenswrapper[4845]: E0202 10:34:58.456596 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-nqqx9" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" Feb 02 10:34:58 crc kubenswrapper[4845]: E0202 10:34:58.456951 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-srnzq" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" Feb 02 10:34:58 crc kubenswrapper[4845]: I0202 10:34:58.533056 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" podStartSLOduration=11.533036742 podStartE2EDuration="11.533036742s" podCreationTimestamp="2026-02-02 10:34:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:58.511594431 +0000 UTC m=+179.602995911" watchObservedRunningTime="2026-02-02 10:34:58.533036742 +0000 UTC m=+179.624438192" Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.153717 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w989s"] Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.455759 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerID="3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5" exitCode=0 Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.455865 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxk8q" event={"ID":"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd","Type":"ContainerDied","Data":"3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5"} Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.457853 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pmn9h" event={"ID":"84cb7b66-62e7-4012-ab80-7c5e6ba51e35","Type":"ContainerStarted","Data":"348f0a2b943396c13d038ffcd12f4d08568bdbc6b8be52ba3b708d58bc280eea"} Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.459404 4845 generic.go:334] "Generic (PLEG): container finished" podID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerID="892e8dde5cc61beb8bab363fa98a836495168d8b933253d6114935636c04fbe1" exitCode=0 Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.459446 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8f9" event={"ID":"66911d31-17db-4d9e-b0c2-9cb699fc0778","Type":"ContainerDied","Data":"892e8dde5cc61beb8bab363fa98a836495168d8b933253d6114935636c04fbe1"} Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.460652 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" event={"ID":"d9d5f2d7-4523-4243-9439-0b6a0d320578","Type":"ContainerStarted","Data":"c948c6540acb32b5137bfcbed20acfa6856014e28e025946807e292dedde5f28"} Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.460852 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.465802 4845 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.486363 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" podStartSLOduration=12.486347604 podStartE2EDuration="12.486347604s" podCreationTimestamp="2026-02-02 10:34:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:59.485210451 +0000 UTC m=+180.576611891" watchObservedRunningTime="2026-02-02 10:34:59.486347604 +0000 UTC m=+180.577749054" Feb 02 10:34:59 crc kubenswrapper[4845]: I0202 10:34:59.525170 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pmn9h" podStartSLOduration=159.525149399 podStartE2EDuration="2m39.525149399s" podCreationTimestamp="2026-02-02 10:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:59.522748569 +0000 UTC m=+180.614150029" watchObservedRunningTime="2026-02-02 10:34:59.525149399 +0000 UTC m=+180.616550849" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.476126 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pz84" event={"ID":"149fcd2d-91c2-493a-a1ec-c8675e1901ef","Type":"ContainerStarted","Data":"e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0"} Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.481473 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxk8q" event={"ID":"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd","Type":"ContainerStarted","Data":"2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb"} Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.484387 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89bwt" event={"ID":"346c427b-6ed6-4bac-ae1f-ee2400ab6884","Type":"ContainerStarted","Data":"2e01706b1e2ad6c4ace7e0a9ee3b107787a005ceeb1be7e17a00e65ea223d90e"} Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.487126 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-566gk" event={"ID":"128f32ab-e2ce-4468-a7e8-bc84aa2bb275","Type":"ContainerStarted","Data":"599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c"} Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.490355 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf5wp" event={"ID":"3ceca4a8-b0dd-47cc-a1fe-818e984af772","Type":"ContainerStarted","Data":"692c027aeff58a06f776073818a159fb8a42b2840f14b77a4b11565409dfc3c8"} Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.498947 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7pz84" podStartSLOduration=4.321369343 podStartE2EDuration="33.498923084s" podCreationTimestamp="2026-02-02 10:34:27 +0000 UTC" firstStartedPulling="2026-02-02 10:34:31.014573281 +0000 UTC m=+152.105974731" lastFinishedPulling="2026-02-02 10:35:00.192127022 +0000 UTC m=+181.283528472" observedRunningTime="2026-02-02 10:35:00.494708972 +0000 UTC m=+181.586110432" watchObservedRunningTime="2026-02-02 10:35:00.498923084 +0000 UTC m=+181.590324534" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.521718 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-89bwt" podStartSLOduration=2.646582586 podStartE2EDuration="30.521689114s" podCreationTimestamp="2026-02-02 10:34:30 +0000 UTC" firstStartedPulling="2026-02-02 10:34:32.129990782 +0000 UTC m=+153.221392232" lastFinishedPulling="2026-02-02 10:35:00.00509731 +0000 UTC 
m=+181.096498760" observedRunningTime="2026-02-02 10:35:00.515083872 +0000 UTC m=+181.606485322" watchObservedRunningTime="2026-02-02 10:35:00.521689114 +0000 UTC m=+181.613090564" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.535625 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xf5wp" podStartSLOduration=3.549269035 podStartE2EDuration="31.535602807s" podCreationTimestamp="2026-02-02 10:34:29 +0000 UTC" firstStartedPulling="2026-02-02 10:34:32.123391661 +0000 UTC m=+153.214793111" lastFinishedPulling="2026-02-02 10:35:00.109725433 +0000 UTC m=+181.201126883" observedRunningTime="2026-02-02 10:35:00.532297441 +0000 UTC m=+181.623698891" watchObservedRunningTime="2026-02-02 10:35:00.535602807 +0000 UTC m=+181.627004257" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.550547 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-566gk" podStartSLOduration=4.515709147 podStartE2EDuration="33.55052702s" podCreationTimestamp="2026-02-02 10:34:27 +0000 UTC" firstStartedPulling="2026-02-02 10:34:30.992543773 +0000 UTC m=+152.083945233" lastFinishedPulling="2026-02-02 10:35:00.027361616 +0000 UTC m=+181.118763106" observedRunningTime="2026-02-02 10:35:00.548944414 +0000 UTC m=+181.640345884" watchObservedRunningTime="2026-02-02 10:35:00.55052702 +0000 UTC m=+181.641928470" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.576728 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jxk8q" podStartSLOduration=2.455867178 podStartE2EDuration="30.576706589s" podCreationTimestamp="2026-02-02 10:34:30 +0000 UTC" firstStartedPulling="2026-02-02 10:34:32.12476249 +0000 UTC m=+153.216163940" lastFinishedPulling="2026-02-02 10:35:00.245601901 +0000 UTC m=+181.337003351" observedRunningTime="2026-02-02 10:35:00.574949708 +0000 UTC m=+181.666351158" 
watchObservedRunningTime="2026-02-02 10:35:00.576706589 +0000 UTC m=+181.668108049" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.809111 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.809187 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.932417 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:35:00 crc kubenswrapper[4845]: I0202 10:35:00.932464 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:35:01 crc kubenswrapper[4845]: I0202 10:35:01.496399 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8f9" event={"ID":"66911d31-17db-4d9e-b0c2-9cb699fc0778","Type":"ContainerStarted","Data":"bbcd1edbd954385c8b6a4defc894e861ef2b8ac66447cad7befe95fddedb8599"} Feb 02 10:35:01 crc kubenswrapper[4845]: I0202 10:35:01.518311 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nb8f9" podStartSLOduration=3.2057943939999998 podStartE2EDuration="31.518297141s" podCreationTimestamp="2026-02-02 10:34:30 +0000 UTC" firstStartedPulling="2026-02-02 10:34:32.132996119 +0000 UTC m=+153.224397569" lastFinishedPulling="2026-02-02 10:35:00.445498866 +0000 UTC m=+181.536900316" observedRunningTime="2026-02-02 10:35:01.516310353 +0000 UTC m=+182.607711813" watchObservedRunningTime="2026-02-02 10:35:01.518297141 +0000 UTC m=+182.609698581" Feb 02 10:35:01 crc kubenswrapper[4845]: I0202 10:35:01.978853 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jxk8q" 
podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerName="registry-server" probeResult="failure" output=< Feb 02 10:35:01 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 10:35:01 crc kubenswrapper[4845]: > Feb 02 10:35:01 crc kubenswrapper[4845]: I0202 10:35:01.980753 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-89bwt" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="registry-server" probeResult="failure" output=< Feb 02 10:35:01 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 10:35:01 crc kubenswrapper[4845]: > Feb 02 10:35:03 crc kubenswrapper[4845]: I0202 10:35:03.045151 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s7qmj" Feb 02 10:35:06 crc kubenswrapper[4845]: I0202 10:35:06.943408 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9"] Feb 02 10:35:06 crc kubenswrapper[4845]: I0202 10:35:06.943808 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" podUID="d9d5f2d7-4523-4243-9439-0b6a0d320578" containerName="controller-manager" containerID="cri-o://c948c6540acb32b5137bfcbed20acfa6856014e28e025946807e292dedde5f28" gracePeriod=30 Feb 02 10:35:07 crc kubenswrapper[4845]: I0202 10:35:07.049825 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q"] Feb 02 10:35:07 crc kubenswrapper[4845]: I0202 10:35:07.050043 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" podUID="fc9b9f1d-2d08-40fa-a147-e57ea489a514" containerName="route-controller-manager" 
containerID="cri-o://32b5f5cc2814e9a9f951b6cb1e99fe49b56e048a66aaa5ae91e748c145111771" gracePeriod=30 Feb 02 10:35:07 crc kubenswrapper[4845]: I0202 10:35:07.500992 4845 patch_prober.go:28] interesting pod/controller-manager-7c58f68dfb-lc8q9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Feb 02 10:35:07 crc kubenswrapper[4845]: I0202 10:35:07.501355 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" podUID="d9d5f2d7-4523-4243-9439-0b6a0d320578" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Feb 02 10:35:07 crc kubenswrapper[4845]: I0202 10:35:07.904518 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:35:07 crc kubenswrapper[4845]: I0202 10:35:07.904571 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:35:07 crc kubenswrapper[4845]: I0202 10:35:07.953159 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.074336 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.174996 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.175045 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.236936 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.539562 4845 generic.go:334] "Generic (PLEG): container finished" podID="d9d5f2d7-4523-4243-9439-0b6a0d320578" containerID="c948c6540acb32b5137bfcbed20acfa6856014e28e025946807e292dedde5f28" exitCode=0 Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.539647 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" event={"ID":"d9d5f2d7-4523-4243-9439-0b6a0d320578","Type":"ContainerDied","Data":"c948c6540acb32b5137bfcbed20acfa6856014e28e025946807e292dedde5f28"} Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.541478 4845 generic.go:334] "Generic (PLEG): container finished" podID="fc9b9f1d-2d08-40fa-a147-e57ea489a514" containerID="32b5f5cc2814e9a9f951b6cb1e99fe49b56e048a66aaa5ae91e748c145111771" exitCode=0 Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.542369 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" event={"ID":"fc9b9f1d-2d08-40fa-a147-e57ea489a514","Type":"ContainerDied","Data":"32b5f5cc2814e9a9f951b6cb1e99fe49b56e048a66aaa5ae91e748c145111771"} Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.585789 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-566gk" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.598295 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7pz84" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.746715 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.752097 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.778223 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-785795fb9f-wz4dw"] Feb 02 10:35:08 crc kubenswrapper[4845]: E0202 10:35:08.778468 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d5f2d7-4523-4243-9439-0b6a0d320578" containerName="controller-manager" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.778481 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d5f2d7-4523-4243-9439-0b6a0d320578" containerName="controller-manager" Feb 02 10:35:08 crc kubenswrapper[4845]: E0202 10:35:08.778494 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9b9f1d-2d08-40fa-a147-e57ea489a514" containerName="route-controller-manager" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.778501 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9b9f1d-2d08-40fa-a147-e57ea489a514" containerName="route-controller-manager" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.778648 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d5f2d7-4523-4243-9439-0b6a0d320578" containerName="controller-manager" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.778667 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9b9f1d-2d08-40fa-a147-e57ea489a514" containerName="route-controller-manager" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.780362 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.783317 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-785795fb9f-wz4dw"] Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833177 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9d5f2d7-4523-4243-9439-0b6a0d320578-serving-cert\") pod \"d9d5f2d7-4523-4243-9439-0b6a0d320578\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833235 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-config\") pod \"d9d5f2d7-4523-4243-9439-0b6a0d320578\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833277 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-proxy-ca-bundles\") pod \"d9d5f2d7-4523-4243-9439-0b6a0d320578\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833304 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-config\") pod \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833323 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-client-ca\") pod \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\" (UID: 
\"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833372 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxzgb\" (UniqueName: \"kubernetes.io/projected/fc9b9f1d-2d08-40fa-a147-e57ea489a514-kube-api-access-zxzgb\") pod \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833397 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smrkl\" (UniqueName: \"kubernetes.io/projected/d9d5f2d7-4523-4243-9439-0b6a0d320578-kube-api-access-smrkl\") pod \"d9d5f2d7-4523-4243-9439-0b6a0d320578\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833428 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9b9f1d-2d08-40fa-a147-e57ea489a514-serving-cert\") pod \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\" (UID: \"fc9b9f1d-2d08-40fa-a147-e57ea489a514\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.833443 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-client-ca\") pod \"d9d5f2d7-4523-4243-9439-0b6a0d320578\" (UID: \"d9d5f2d7-4523-4243-9439-0b6a0d320578\") " Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.834032 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d9d5f2d7-4523-4243-9439-0b6a0d320578" (UID: "d9d5f2d7-4523-4243-9439-0b6a0d320578"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.834063 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-config" (OuterVolumeSpecName: "config") pod "d9d5f2d7-4523-4243-9439-0b6a0d320578" (UID: "d9d5f2d7-4523-4243-9439-0b6a0d320578"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.834045 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-client-ca" (OuterVolumeSpecName: "client-ca") pod "d9d5f2d7-4523-4243-9439-0b6a0d320578" (UID: "d9d5f2d7-4523-4243-9439-0b6a0d320578"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.834160 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-client-ca" (OuterVolumeSpecName: "client-ca") pod "fc9b9f1d-2d08-40fa-a147-e57ea489a514" (UID: "fc9b9f1d-2d08-40fa-a147-e57ea489a514"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.834209 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-config" (OuterVolumeSpecName: "config") pod "fc9b9f1d-2d08-40fa-a147-e57ea489a514" (UID: "fc9b9f1d-2d08-40fa-a147-e57ea489a514"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.838753 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d5f2d7-4523-4243-9439-0b6a0d320578-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d9d5f2d7-4523-4243-9439-0b6a0d320578" (UID: "d9d5f2d7-4523-4243-9439-0b6a0d320578"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.838832 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9b9f1d-2d08-40fa-a147-e57ea489a514-kube-api-access-zxzgb" (OuterVolumeSpecName: "kube-api-access-zxzgb") pod "fc9b9f1d-2d08-40fa-a147-e57ea489a514" (UID: "fc9b9f1d-2d08-40fa-a147-e57ea489a514"). InnerVolumeSpecName "kube-api-access-zxzgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.843605 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc9b9f1d-2d08-40fa-a147-e57ea489a514-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fc9b9f1d-2d08-40fa-a147-e57ea489a514" (UID: "fc9b9f1d-2d08-40fa-a147-e57ea489a514"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.847185 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d5f2d7-4523-4243-9439-0b6a0d320578-kube-api-access-smrkl" (OuterVolumeSpecName: "kube-api-access-smrkl") pod "d9d5f2d7-4523-4243-9439-0b6a0d320578" (UID: "d9d5f2d7-4523-4243-9439-0b6a0d320578"). InnerVolumeSpecName "kube-api-access-smrkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.934373 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5462ffc5-3458-4343-90de-625a307d56d0-serving-cert\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.934424 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2czr9\" (UniqueName: \"kubernetes.io/projected/5462ffc5-3458-4343-90de-625a307d56d0-kube-api-access-2czr9\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.934634 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-config\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.934706 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-proxy-ca-bundles\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.934775 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-client-ca\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.934951 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.934976 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc9b9f1d-2d08-40fa-a147-e57ea489a514-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.934990 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxzgb\" (UniqueName: \"kubernetes.io/projected/fc9b9f1d-2d08-40fa-a147-e57ea489a514-kube-api-access-zxzgb\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.935005 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smrkl\" (UniqueName: \"kubernetes.io/projected/d9d5f2d7-4523-4243-9439-0b6a0d320578-kube-api-access-smrkl\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.935017 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9b9f1d-2d08-40fa-a147-e57ea489a514-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.935029 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.935052 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d9d5f2d7-4523-4243-9439-0b6a0d320578-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.935064 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:08 crc kubenswrapper[4845]: I0202 10:35:08.935078 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d9d5f2d7-4523-4243-9439-0b6a0d320578-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.035672 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-config\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.036682 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-proxy-ca-bundles\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.036711 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-client-ca\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.036759 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5462ffc5-3458-4343-90de-625a307d56d0-serving-cert\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.036778 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2czr9\" (UniqueName: \"kubernetes.io/projected/5462ffc5-3458-4343-90de-625a307d56d0-kube-api-access-2czr9\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.037023 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-config\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.037496 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-client-ca\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.039929 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-proxy-ca-bundles\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc 
kubenswrapper[4845]: I0202 10:35:09.041902 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5462ffc5-3458-4343-90de-625a307d56d0-serving-cert\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.052328 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2czr9\" (UniqueName: \"kubernetes.io/projected/5462ffc5-3458-4343-90de-625a307d56d0-kube-api-access-2czr9\") pod \"controller-manager-785795fb9f-wz4dw\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.107609 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.549369 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" event={"ID":"fc9b9f1d-2d08-40fa-a147-e57ea489a514","Type":"ContainerDied","Data":"f4312bd794368ada2b214c2bbcf464e538e85d6eda329587cacdfdcfaf47a059"} Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.549700 4845 scope.go:117] "RemoveContainer" containerID="32b5f5cc2814e9a9f951b6cb1e99fe49b56e048a66aaa5ae91e748c145111771" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.549911 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.557350 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" event={"ID":"d9d5f2d7-4523-4243-9439-0b6a0d320578","Type":"ContainerDied","Data":"1e75a66a1944a3a8fc705151c58070ce86c62b0f80c33f6ac98c68e911d70d1a"} Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.557431 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.597196 4845 scope.go:117] "RemoveContainer" containerID="c948c6540acb32b5137bfcbed20acfa6856014e28e025946807e292dedde5f28" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.597246 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-566gk"] Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.599994 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-785795fb9f-wz4dw"] Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.602967 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q"] Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.605452 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5ff8755c47-x4g8q"] Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.617725 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9"] Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.621652 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c58f68dfb-lc8q9"] Feb 02 
10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.731725 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d5f2d7-4523-4243-9439-0b6a0d320578" path="/var/lib/kubelet/pods/d9d5f2d7-4523-4243-9439-0b6a0d320578/volumes" Feb 02 10:35:09 crc kubenswrapper[4845]: I0202 10:35:09.732370 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9b9f1d-2d08-40fa-a147-e57ea489a514" path="/var/lib/kubelet/pods/fc9b9f1d-2d08-40fa-a147-e57ea489a514/volumes" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.094019 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.095067 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.097197 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.097366 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.104762 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.251291 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e429cb8e-ac68-4769-8144-bd170eb88425-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e429cb8e-ac68-4769-8144-bd170eb88425\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.251382 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e429cb8e-ac68-4769-8144-bd170eb88425-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e429cb8e-ac68-4769-8144-bd170eb88425\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.352462 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e429cb8e-ac68-4769-8144-bd170eb88425-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e429cb8e-ac68-4769-8144-bd170eb88425\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.352554 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e429cb8e-ac68-4769-8144-bd170eb88425-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e429cb8e-ac68-4769-8144-bd170eb88425\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.352631 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e429cb8e-ac68-4769-8144-bd170eb88425-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e429cb8e-ac68-4769-8144-bd170eb88425\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.370944 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e429cb8e-ac68-4769-8144-bd170eb88425-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e429cb8e-ac68-4769-8144-bd170eb88425\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.410319 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.482318 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xf5wp"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.482677 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xf5wp"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.566399 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" event={"ID":"5462ffc5-3458-4343-90de-625a307d56d0","Type":"ContainerStarted","Data":"e682377b37f912142b25f90537ee5505ca0068713e5d79fc624dba607d356d8e"}
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.566450 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" event={"ID":"5462ffc5-3458-4343-90de-625a307d56d0","Type":"ContainerStarted","Data":"24a79b6e178f4c50f852ca424030cfff6044bf1a543bc4a2ad28acaa9dd3e5e1"}
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.566708 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.568017 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xf5wp"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.571218 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nb8f9"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.571262 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nb8f9"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.576619 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.580427 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-566gk" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerName="registry-server" containerID="cri-o://599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c" gracePeriod=2
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.623858 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" podStartSLOduration=4.623843664 podStartE2EDuration="4.623843664s" podCreationTimestamp="2026-02-02 10:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:10.596829141 +0000 UTC m=+191.688230611" watchObservedRunningTime="2026-02-02 10:35:10.623843664 +0000 UTC m=+191.715245114"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.632751 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nb8f9"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.642971 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xf5wp"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.646225 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.854205 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-89bwt"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.894646 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-89bwt"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.970115 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jxk8q"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.977837 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-566gk"
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.988482 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7pz84"]
Feb 02 10:35:10 crc kubenswrapper[4845]: I0202 10:35:10.988723 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7pz84" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerName="registry-server" containerID="cri-o://e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0" gracePeriod=2
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.006777 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jxk8q"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.066004 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s42p\" (UniqueName: \"kubernetes.io/projected/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-kube-api-access-5s42p\") pod \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") "
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.066140 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-utilities\") pod \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") "
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.066197 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-catalog-content\") pod \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\" (UID: \"128f32ab-e2ce-4468-a7e8-bc84aa2bb275\") "
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.067251 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-utilities" (OuterVolumeSpecName: "utilities") pod "128f32ab-e2ce-4468-a7e8-bc84aa2bb275" (UID: "128f32ab-e2ce-4468-a7e8-bc84aa2bb275"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.072435 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-kube-api-access-5s42p" (OuterVolumeSpecName: "kube-api-access-5s42p") pod "128f32ab-e2ce-4468-a7e8-bc84aa2bb275" (UID: "128f32ab-e2ce-4468-a7e8-bc84aa2bb275"). InnerVolumeSpecName "kube-api-access-5s42p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.117463 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128f32ab-e2ce-4468-a7e8-bc84aa2bb275" (UID: "128f32ab-e2ce-4468-a7e8-bc84aa2bb275"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.167104 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s42p\" (UniqueName: \"kubernetes.io/projected/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-kube-api-access-5s42p\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.167143 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.167155 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128f32ab-e2ce-4468-a7e8-bc84aa2bb275-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.192296 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"]
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.192763 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerName="extract-content"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.192774 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerName="extract-content"
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.192785 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerName="extract-utilities"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.192791 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerName="extract-utilities"
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.192800 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerName="registry-server"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.192806 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerName="registry-server"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.192911 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerName="registry-server"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.193326 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.195048 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.195083 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.195048 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.195202 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.195991 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.198669 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.200323 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"]
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.268685 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-serving-cert\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.268819 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxt9f\" (UniqueName: \"kubernetes.io/projected/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-kube-api-access-pxt9f\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.268869 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-config\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.268924 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-client-ca\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.346529 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pz84"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.369754 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxt9f\" (UniqueName: \"kubernetes.io/projected/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-kube-api-access-pxt9f\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.369813 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-config\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.369841 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-client-ca\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.369909 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-serving-cert\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.371556 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-config\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.371648 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-client-ca\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.374514 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-serving-cert\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.396506 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxt9f\" (UniqueName: \"kubernetes.io/projected/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-kube-api-access-pxt9f\") pod \"route-controller-manager-5f8f98956b-pr9zj\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.470668 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-utilities\") pod \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") "
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.470723 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvpsx\" (UniqueName: \"kubernetes.io/projected/149fcd2d-91c2-493a-a1ec-c8675e1901ef-kube-api-access-nvpsx\") pod \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") "
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.470773 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-catalog-content\") pod \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\" (UID: \"149fcd2d-91c2-493a-a1ec-c8675e1901ef\") "
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.471396 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-utilities" (OuterVolumeSpecName: "utilities") pod "149fcd2d-91c2-493a-a1ec-c8675e1901ef" (UID: "149fcd2d-91c2-493a-a1ec-c8675e1901ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.473121 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149fcd2d-91c2-493a-a1ec-c8675e1901ef-kube-api-access-nvpsx" (OuterVolumeSpecName: "kube-api-access-nvpsx") pod "149fcd2d-91c2-493a-a1ec-c8675e1901ef" (UID: "149fcd2d-91c2-493a-a1ec-c8675e1901ef"). InnerVolumeSpecName "kube-api-access-nvpsx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.510433 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.521017 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "149fcd2d-91c2-493a-a1ec-c8675e1901ef" (UID: "149fcd2d-91c2-493a-a1ec-c8675e1901ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.571797 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.571849 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvpsx\" (UniqueName: \"kubernetes.io/projected/149fcd2d-91c2-493a-a1ec-c8675e1901ef-kube-api-access-nvpsx\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.571868 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149fcd2d-91c2-493a-a1ec-c8675e1901ef-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.587410 4845 generic.go:334] "Generic (PLEG): container finished" podID="e429cb8e-ac68-4769-8144-bd170eb88425" containerID="67bd6a8667f39a3ae555add49d53eca4ccef6d7bd5ef20691af6315ff909ac04" exitCode=0
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.587568 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e429cb8e-ac68-4769-8144-bd170eb88425","Type":"ContainerDied","Data":"67bd6a8667f39a3ae555add49d53eca4ccef6d7bd5ef20691af6315ff909ac04"}
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.587820 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e429cb8e-ac68-4769-8144-bd170eb88425","Type":"ContainerStarted","Data":"0c7a8ca8d336bdc0a77f701f2799e13b55a207b5da77a1accc1bfdd66da2f277"}
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.590978 4845 generic.go:334] "Generic (PLEG): container finished" podID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" containerID="599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c" exitCode=0
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.591043 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-566gk" event={"ID":"128f32ab-e2ce-4468-a7e8-bc84aa2bb275","Type":"ContainerDied","Data":"599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c"}
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.591057 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-566gk"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.591069 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-566gk" event={"ID":"128f32ab-e2ce-4468-a7e8-bc84aa2bb275","Type":"ContainerDied","Data":"9bcdab2008f0c99412111dda17b5eb250f02325e2f7935a2aadaa0f85ebf0c92"}
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.591085 4845 scope.go:117] "RemoveContainer" containerID="599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.594053 4845 generic.go:334] "Generic (PLEG): container finished" podID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerID="e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0" exitCode=0
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.594803 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7pz84"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.598982 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pz84" event={"ID":"149fcd2d-91c2-493a-a1ec-c8675e1901ef","Type":"ContainerDied","Data":"e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0"}
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.599041 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7pz84" event={"ID":"149fcd2d-91c2-493a-a1ec-c8675e1901ef","Type":"ContainerDied","Data":"9c90348dba10073c8c8427f15a5626e4e8ea2c366d1f13b9219eb720f44735c2"}
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.682053 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nb8f9"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.694833 4845 scope.go:117] "RemoveContainer" containerID="3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.696862 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7pz84"]
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.710590 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7pz84"]
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.746356 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" path="/var/lib/kubelet/pods/149fcd2d-91c2-493a-a1ec-c8675e1901ef/volumes"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.747100 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-566gk"]
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.747142 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-566gk"]
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.775560 4845 scope.go:117] "RemoveContainer" containerID="79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.793088 4845 scope.go:117] "RemoveContainer" containerID="599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c"
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.794562 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c\": container with ID starting with 599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c not found: ID does not exist" containerID="599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.794595 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c"} err="failed to get container status \"599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c\": rpc error: code = NotFound desc = could not find container \"599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c\": container with ID starting with 599b1bff5deca85c0c8cef3c3345030830c2da69f15c44de71241ffac78ed91c not found: ID does not exist"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.794636 4845 scope.go:117] "RemoveContainer" containerID="3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b"
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.795006 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b\": container with ID starting with 3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b not found: ID does not exist" containerID="3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.795029 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b"} err="failed to get container status \"3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b\": rpc error: code = NotFound desc = could not find container \"3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b\": container with ID starting with 3ebe20a234144bc7c881d5b70747b9174e6d6c3731b82fc0bc418fa86c82763b not found: ID does not exist"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.795048 4845 scope.go:117] "RemoveContainer" containerID="79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822"
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.795498 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822\": container with ID starting with 79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822 not found: ID does not exist" containerID="79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.795523 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822"} err="failed to get container status \"79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822\": rpc error: code = NotFound desc = could not find container \"79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822\": container with ID starting with 79beba806d11e77da768b00372c982dccf28936f17f8b1718101665cef165822 not found: ID does not exist"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.795542 4845 scope.go:117] "RemoveContainer" containerID="e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.811216 4845 scope.go:117] "RemoveContainer" containerID="30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.827624 4845 scope.go:117] "RemoveContainer" containerID="4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.839751 4845 scope.go:117] "RemoveContainer" containerID="e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0"
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.840087 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0\": container with ID starting with e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0 not found: ID does not exist" containerID="e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.840119 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0"} err="failed to get container status \"e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0\": rpc error: code = NotFound desc = could not find container \"e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0\": container with ID starting with e870b0d19662252b4f303486b370da820a85f86eaa5dc9f096cd370e72f8bdf0 not found: ID does not exist"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.840141 4845 scope.go:117] "RemoveContainer" containerID="30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992"
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.840483 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992\": container with ID starting with 30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992 not found: ID does not exist" containerID="30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.840503 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992"} err="failed to get container status \"30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992\": rpc error: code = NotFound desc = could not find container \"30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992\": container with ID starting with 30f2480599158ff15839a78a41e3c8ce5042e659014fa22c57bffdb6f66dd992 not found: ID does not exist"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.840518 4845 scope.go:117] "RemoveContainer" containerID="4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358"
Feb 02 10:35:11 crc kubenswrapper[4845]: E0202 10:35:11.840825 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358\": container with ID starting with 4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358 not found: ID does not exist" containerID="4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.840845 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358"} err="failed to get container status \"4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358\": rpc error: code = NotFound desc = could not find container \"4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358\": container with ID starting with 4ea4bf24b9001d3db5017954e52445a32b7c139a1e3f3a14c2209147987ee358 not found: ID does not exist"
Feb 02 10:35:11 crc kubenswrapper[4845]: I0202 10:35:11.962839 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"]
Feb 02 10:35:11 crc kubenswrapper[4845]: W0202 10:35:11.969789 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d5d1c9b_bf0d_4e5f_b70c_5460f17e53a9.slice/crio-edd76338c4cf9565b4d4c3ff6adff77448c36e6ba5ba8384ccd4a0a7e5b2dd8a WatchSource:0}: Error finding container edd76338c4cf9565b4d4c3ff6adff77448c36e6ba5ba8384ccd4a0a7e5b2dd8a: Status 404 returned error can't find the container with id edd76338c4cf9565b4d4c3ff6adff77448c36e6ba5ba8384ccd4a0a7e5b2dd8a
Feb 02 10:35:12 crc kubenswrapper[4845]: I0202 10:35:12.601477 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj" event={"ID":"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9","Type":"ContainerStarted","Data":"d824c9ea9648f75ba557aac1840de3f2484a4a72bc9492dd547ef7324c384bec"}
Feb 02 10:35:12 crc kubenswrapper[4845]: I0202 10:35:12.601840 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj" event={"ID":"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9","Type":"ContainerStarted","Data":"edd76338c4cf9565b4d4c3ff6adff77448c36e6ba5ba8384ccd4a0a7e5b2dd8a"}
Feb 02 10:35:12 crc kubenswrapper[4845]: I0202 10:35:12.603077 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:12 crc kubenswrapper[4845]: I0202 10:35:12.613633 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:12 crc kubenswrapper[4845]: I0202 10:35:12.619558 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj" podStartSLOduration=5.619484269 podStartE2EDuration="5.619484269s" podCreationTimestamp="2026-02-02 10:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:12.617841751 +0000 UTC m=+193.709243201" watchObservedRunningTime="2026-02-02 10:35:12.619484269 +0000 UTC m=+193.710885709"
Feb 02 10:35:12 crc kubenswrapper[4845]: I0202 10:35:12.984319 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.089174 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e429cb8e-ac68-4769-8144-bd170eb88425-kubelet-dir\") pod \"e429cb8e-ac68-4769-8144-bd170eb88425\" (UID: \"e429cb8e-ac68-4769-8144-bd170eb88425\") "
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.089239 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e429cb8e-ac68-4769-8144-bd170eb88425-kube-api-access\") pod \"e429cb8e-ac68-4769-8144-bd170eb88425\" (UID: \"e429cb8e-ac68-4769-8144-bd170eb88425\") "
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.089484 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e429cb8e-ac68-4769-8144-bd170eb88425-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e429cb8e-ac68-4769-8144-bd170eb88425" (UID: "e429cb8e-ac68-4769-8144-bd170eb88425"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.093799 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e429cb8e-ac68-4769-8144-bd170eb88425-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e429cb8e-ac68-4769-8144-bd170eb88425" (UID: "e429cb8e-ac68-4769-8144-bd170eb88425"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.190546 4845 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e429cb8e-ac68-4769-8144-bd170eb88425-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.190591 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e429cb8e-ac68-4769-8144-bd170eb88425-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.386674 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-89bwt"]
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.386953 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-89bwt" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="registry-server" containerID="cri-o://2e01706b1e2ad6c4ace7e0a9ee3b107787a005ceeb1be7e17a00e65ea223d90e" gracePeriod=2
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.627858 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"e429cb8e-ac68-4769-8144-bd170eb88425","Type":"ContainerDied","Data":"0c7a8ca8d336bdc0a77f701f2799e13b55a207b5da77a1accc1bfdd66da2f277"}
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.627910 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c7a8ca8d336bdc0a77f701f2799e13b55a207b5da77a1accc1bfdd66da2f277"
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.627960 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.631741 4845 generic.go:334] "Generic (PLEG): container finished" podID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerID="2e01706b1e2ad6c4ace7e0a9ee3b107787a005ceeb1be7e17a00e65ea223d90e" exitCode=0
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.631813 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89bwt" event={"ID":"346c427b-6ed6-4bac-ae1f-ee2400ab6884","Type":"ContainerDied","Data":"2e01706b1e2ad6c4ace7e0a9ee3b107787a005ceeb1be7e17a00e65ea223d90e"}
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.724219 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128f32ab-e2ce-4468-a7e8-bc84aa2bb275" path="/var/lib/kubelet/pods/128f32ab-e2ce-4468-a7e8-bc84aa2bb275/volumes"
Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.815341 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.898045 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-catalog-content\") pod \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.898108 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7f2m\" (UniqueName: \"kubernetes.io/projected/346c427b-6ed6-4bac-ae1f-ee2400ab6884-kube-api-access-p7f2m\") pod \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.898188 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-utilities\") pod \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\" (UID: \"346c427b-6ed6-4bac-ae1f-ee2400ab6884\") " Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.898948 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-utilities" (OuterVolumeSpecName: "utilities") pod "346c427b-6ed6-4bac-ae1f-ee2400ab6884" (UID: "346c427b-6ed6-4bac-ae1f-ee2400ab6884"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.906125 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346c427b-6ed6-4bac-ae1f-ee2400ab6884-kube-api-access-p7f2m" (OuterVolumeSpecName: "kube-api-access-p7f2m") pod "346c427b-6ed6-4bac-ae1f-ee2400ab6884" (UID: "346c427b-6ed6-4bac-ae1f-ee2400ab6884"). InnerVolumeSpecName "kube-api-access-p7f2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.931486 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "346c427b-6ed6-4bac-ae1f-ee2400ab6884" (UID: "346c427b-6ed6-4bac-ae1f-ee2400ab6884"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.984863 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jxk8q"] Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.985100 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jxk8q" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerName="registry-server" containerID="cri-o://2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb" gracePeriod=2 Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.999582 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.999618 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7f2m\" (UniqueName: \"kubernetes.io/projected/346c427b-6ed6-4bac-ae1f-ee2400ab6884-kube-api-access-p7f2m\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:13 crc kubenswrapper[4845]: I0202 10:35:13.999629 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/346c427b-6ed6-4bac-ae1f-ee2400ab6884-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.439672 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.505517 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-catalog-content\") pod \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.505734 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-928mf\" (UniqueName: \"kubernetes.io/projected/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-kube-api-access-928mf\") pod \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.505825 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-utilities\") pod \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\" (UID: \"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd\") " Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.506882 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-utilities" (OuterVolumeSpecName: "utilities") pod "1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" (UID: "1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.512086 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-kube-api-access-928mf" (OuterVolumeSpecName: "kube-api-access-928mf") pod "1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" (UID: "1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd"). InnerVolumeSpecName "kube-api-access-928mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.607262 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-928mf\" (UniqueName: \"kubernetes.io/projected/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-kube-api-access-928mf\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.607545 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.638939 4845 generic.go:334] "Generic (PLEG): container finished" podID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerID="2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb" exitCode=0 Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.639017 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxk8q" event={"ID":"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd","Type":"ContainerDied","Data":"2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb"} Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.639050 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxk8q" event={"ID":"1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd","Type":"ContainerDied","Data":"45e45e9dbbedd64142e2fefb82cba902c15048017b203f1719c5975bd5a5ec46"} Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.639068 4845 scope.go:117] "RemoveContainer" containerID="2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.639116 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jxk8q" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.642024 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-89bwt" event={"ID":"346c427b-6ed6-4bac-ae1f-ee2400ab6884","Type":"ContainerDied","Data":"6eeeecc6f7635724fe5c5de24d8b0a5978478aedd321e7117013fb83d5e17bed"} Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.642142 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-89bwt" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.651388 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srnzq" event={"ID":"b3624e54-1097-4ab1-bfff-d7e0f721f8f0","Type":"ContainerStarted","Data":"5a24fa9c118523cf61e0979a46080c1c45d6df89bfac951d244d917947874a3c"} Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.658477 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" (UID: "1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.659554 4845 scope.go:117] "RemoveContainer" containerID="3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.662749 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqqx9" event={"ID":"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06","Type":"ContainerStarted","Data":"b1fd42c06508a5f7cc77e759a154a4b98880587d65c3430e3b9c376f3441f3e5"} Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.695561 4845 scope.go:117] "RemoveContainer" containerID="f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.708624 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.734061 4845 scope.go:117] "RemoveContainer" containerID="2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb" Feb 02 10:35:14 crc kubenswrapper[4845]: E0202 10:35:14.734630 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb\": container with ID starting with 2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb not found: ID does not exist" containerID="2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.734684 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb"} err="failed to get container status \"2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb\": rpc error: 
code = NotFound desc = could not find container \"2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb\": container with ID starting with 2cec9c5daa4e3c700a159a8ca737ab0f93ab3b9ee5094f6f92b629949b0337fb not found: ID does not exist" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.734714 4845 scope.go:117] "RemoveContainer" containerID="3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5" Feb 02 10:35:14 crc kubenswrapper[4845]: E0202 10:35:14.735009 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5\": container with ID starting with 3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5 not found: ID does not exist" containerID="3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.735094 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5"} err="failed to get container status \"3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5\": rpc error: code = NotFound desc = could not find container \"3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5\": container with ID starting with 3e41e91374d41dc9396251c1a3103698cf4930fd702dd17f19478e9e82fee7f5 not found: ID does not exist" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.735177 4845 scope.go:117] "RemoveContainer" containerID="f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175" Feb 02 10:35:14 crc kubenswrapper[4845]: E0202 10:35:14.735975 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175\": container with ID starting with 
f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175 not found: ID does not exist" containerID="f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.736083 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175"} err="failed to get container status \"f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175\": rpc error: code = NotFound desc = could not find container \"f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175\": container with ID starting with f3f753184491872d9d76899a0dbcf4380da09a3a8edc986931923c603f6e9175 not found: ID does not exist" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.736159 4845 scope.go:117] "RemoveContainer" containerID="2e01706b1e2ad6c4ace7e0a9ee3b107787a005ceeb1be7e17a00e65ea223d90e" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.744005 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-89bwt"] Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.749249 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-89bwt"] Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.750632 4845 scope.go:117] "RemoveContainer" containerID="d5ee5a1faa591f8599ea0b16afa6af997338eec66ea0aded98078f3d0d23d736" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.769518 4845 scope.go:117] "RemoveContainer" containerID="ac8e195b0cbc3f127a71603eaf254c9928ef5832347572bb602ee62efcc4964c" Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.964004 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jxk8q"] Feb 02 10:35:14 crc kubenswrapper[4845]: I0202 10:35:14.969179 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jxk8q"] Feb 02 
10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.675931 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqqx9" event={"ID":"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06","Type":"ContainerDied","Data":"b1fd42c06508a5f7cc77e759a154a4b98880587d65c3430e3b9c376f3441f3e5"} Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.675931 4845 generic.go:334] "Generic (PLEG): container finished" podID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerID="b1fd42c06508a5f7cc77e759a154a4b98880587d65c3430e3b9c376f3441f3e5" exitCode=0 Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.688542 4845 generic.go:334] "Generic (PLEG): container finished" podID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerID="5a24fa9c118523cf61e0979a46080c1c45d6df89bfac951d244d917947874a3c" exitCode=0 Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.688579 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srnzq" event={"ID":"b3624e54-1097-4ab1-bfff-d7e0f721f8f0","Type":"ContainerDied","Data":"5a24fa9c118523cf61e0979a46080c1c45d6df89bfac951d244d917947874a3c"} Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.719455 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" path="/var/lib/kubelet/pods/1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd/volumes" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.720294 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" path="/var/lib/kubelet/pods/346c427b-6ed6-4bac-ae1f-ee2400ab6884/volumes" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897211 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897446 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" 
containerName="extract-utilities" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897458 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerName="extract-utilities" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897471 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerName="registry-server" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897477 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerName="registry-server" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897485 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerName="extract-utilities" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897491 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerName="extract-utilities" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897497 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="extract-utilities" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897502 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="extract-utilities" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897511 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerName="extract-content" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897516 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerName="extract-content" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897522 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e429cb8e-ac68-4769-8144-bd170eb88425" 
containerName="pruner" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897528 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e429cb8e-ac68-4769-8144-bd170eb88425" containerName="pruner" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897536 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="extract-content" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897543 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="extract-content" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897550 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerName="extract-content" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897555 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerName="extract-content" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897561 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerName="registry-server" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897567 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerName="registry-server" Feb 02 10:35:15 crc kubenswrapper[4845]: E0202 10:35:15.897578 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="registry-server" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897584 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="registry-server" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897670 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="346c427b-6ed6-4bac-ae1f-ee2400ab6884" containerName="registry-server" 
Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897683 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e429cb8e-ac68-4769-8144-bd170eb88425" containerName="pruner" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897690 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1579ee3d-0fc7-456f-a78a-eb18aa7bf2bd" containerName="registry-server" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.897696 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="149fcd2d-91c2-493a-a1ec-c8675e1901ef" containerName="registry-server" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.898048 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.903251 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.903972 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.904839 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.923686 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.923772 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-var-lock\") pod 
\"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:15 crc kubenswrapper[4845]: I0202 10:35:15.923830 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.025450 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.025516 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-var-lock\") pod \"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.025548 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.025625 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 
10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.025645 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-var-lock\") pod \"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.043152 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.226742 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.241549 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.241620 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.695195 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srnzq" event={"ID":"b3624e54-1097-4ab1-bfff-d7e0f721f8f0","Type":"ContainerStarted","Data":"68d3140ebd20281e9eddc75449b750e9027bcd03b8ff2f48c1cab9686d75572d"} Feb 02 10:35:16 crc 
kubenswrapper[4845]: I0202 10:35:16.699072 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqqx9" event={"ID":"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06","Type":"ContainerStarted","Data":"5769d4596805c8a147c91069eb2109528eac8714acdda17a418b761289524bf3"} Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.717878 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.724167 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-srnzq" podStartSLOduration=3.354625771 podStartE2EDuration="49.724151002s" podCreationTimestamp="2026-02-02 10:34:27 +0000 UTC" firstStartedPulling="2026-02-02 10:34:29.815042252 +0000 UTC m=+150.906443702" lastFinishedPulling="2026-02-02 10:35:16.184567483 +0000 UTC m=+197.275968933" observedRunningTime="2026-02-02 10:35:16.714138962 +0000 UTC m=+197.805540412" watchObservedRunningTime="2026-02-02 10:35:16.724151002 +0000 UTC m=+197.815552452" Feb 02 10:35:16 crc kubenswrapper[4845]: I0202 10:35:16.744096 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nqqx9" podStartSLOduration=4.700364839 podStartE2EDuration="49.74407326s" podCreationTimestamp="2026-02-02 10:34:27 +0000 UTC" firstStartedPulling="2026-02-02 10:34:31.006523488 +0000 UTC m=+152.097924938" lastFinishedPulling="2026-02-02 10:35:16.050231909 +0000 UTC m=+197.141633359" observedRunningTime="2026-02-02 10:35:16.74199138 +0000 UTC m=+197.833392840" watchObservedRunningTime="2026-02-02 10:35:16.74407326 +0000 UTC m=+197.835474710" Feb 02 10:35:17 crc kubenswrapper[4845]: I0202 10:35:17.516068 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:35:17 crc kubenswrapper[4845]: I0202 10:35:17.516131 4845 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:35:17 crc kubenswrapper[4845]: I0202 10:35:17.705763 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"21f3600b-d9f0-4d8e-8a5a-2b03161164b4","Type":"ContainerStarted","Data":"706922acec942089a8fd3b03c826b07b1a0b641cf0a3415a88c290fcbe0bd620"} Feb 02 10:35:17 crc kubenswrapper[4845]: I0202 10:35:17.705831 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"21f3600b-d9f0-4d8e-8a5a-2b03161164b4","Type":"ContainerStarted","Data":"ae045aee676ac5379e234ac23abc2cda69a140b263c32736f6d478088a0d6ce2"} Feb 02 10:35:17 crc kubenswrapper[4845]: I0202 10:35:17.724312 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.724296029 podStartE2EDuration="2.724296029s" podCreationTimestamp="2026-02-02 10:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:17.723328061 +0000 UTC m=+198.814729501" watchObservedRunningTime="2026-02-02 10:35:17.724296029 +0000 UTC m=+198.815697479" Feb 02 10:35:17 crc kubenswrapper[4845]: I0202 10:35:17.739170 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:35:17 crc kubenswrapper[4845]: I0202 10:35:17.739464 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:35:18 crc kubenswrapper[4845]: I0202 10:35:18.575779 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:35:18 crc kubenswrapper[4845]: I0202 10:35:18.778404 4845 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/certified-operators-nqqx9" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="registry-server" probeResult="failure" output=< Feb 02 10:35:18 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 10:35:18 crc kubenswrapper[4845]: > Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.176022 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" podUID="98d47741-7063-487f-a38b-b9c398f3e07e" containerName="oauth-openshift" containerID="cri-o://93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e" gracePeriod=15 Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.667661 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.702466 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd"] Feb 02 10:35:24 crc kubenswrapper[4845]: E0202 10:35:24.702696 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d47741-7063-487f-a38b-b9c398f3e07e" containerName="oauth-openshift" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.702708 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d47741-7063-487f-a38b-b9c398f3e07e" containerName="oauth-openshift" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.702815 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d47741-7063-487f-a38b-b9c398f3e07e" containerName="oauth-openshift" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.703202 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.717452 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd"] Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739451 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-service-ca\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739554 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-idp-0-file-data\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739590 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-audit-policies\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739623 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-session\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739656 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-ocp-branding-template\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739688 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-serving-cert\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739723 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-error\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739757 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-login\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739809 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98d47741-7063-487f-a38b-b9c398f3e07e-audit-dir\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739851 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-provider-selection\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739917 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-router-certs\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.739981 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-cliconfig\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740011 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6x5c\" (UniqueName: \"kubernetes.io/projected/98d47741-7063-487f-a38b-b9c398f3e07e-kube-api-access-c6x5c\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740045 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-trusted-ca-bundle\") pod \"98d47741-7063-487f-a38b-b9c398f3e07e\" (UID: \"98d47741-7063-487f-a38b-b9c398f3e07e\") " Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740259 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/34a4ebb3-154b-43c2-8566-8a638df7ecdf-audit-dir\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740304 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgvqw\" (UniqueName: \"kubernetes.io/projected/34a4ebb3-154b-43c2-8566-8a638df7ecdf-kube-api-access-mgvqw\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740333 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740365 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740402 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-login\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: 
\"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740443 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740489 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-session\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740525 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740564 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-router-certs\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 
10:35:24.740592 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740644 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-audit-policies\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740674 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-error\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740696 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.740714 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-service-ca\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.742230 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.742251 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.742386 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98d47741-7063-487f-a38b-b9c398f3e07e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.742393 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.742536 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.750477 4845 generic.go:334] "Generic (PLEG): container finished" podID="98d47741-7063-487f-a38b-b9c398f3e07e" containerID="93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e" exitCode=0 Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.750547 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" event={"ID":"98d47741-7063-487f-a38b-b9c398f3e07e","Type":"ContainerDied","Data":"93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e"} Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.750560 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.750584 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-w989s" event={"ID":"98d47741-7063-487f-a38b-b9c398f3e07e","Type":"ContainerDied","Data":"7e70d90a9fcf67e25e1301d6daeb2cdf5956e3a601c854abda0627c2816b60da"} Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.750604 4845 scope.go:117] "RemoveContainer" containerID="93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.751265 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.752000 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.752268 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d47741-7063-487f-a38b-b9c398f3e07e-kube-api-access-c6x5c" (OuterVolumeSpecName: "kube-api-access-c6x5c") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). 
InnerVolumeSpecName "kube-api-access-c6x5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.752583 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.752900 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.753969 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.754152 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.754619 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.760324 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "98d47741-7063-487f-a38b-b9c398f3e07e" (UID: "98d47741-7063-487f-a38b-b9c398f3e07e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.794034 4845 scope.go:117] "RemoveContainer" containerID="93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e" Feb 02 10:35:24 crc kubenswrapper[4845]: E0202 10:35:24.794542 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e\": container with ID starting with 93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e not found: ID does not exist" containerID="93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.794589 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e"} err="failed to get container status \"93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e\": rpc error: code = NotFound desc = could not find container \"93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e\": container with ID starting with 93e37fe117c0972dc087595f3fe9a911fd4435d1bda815a2b2a543564e02392e not found: ID does not exist" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.841809 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgvqw\" (UniqueName: \"kubernetes.io/projected/34a4ebb3-154b-43c2-8566-8a638df7ecdf-kube-api-access-mgvqw\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.841862 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.841951 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.841985 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-login\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842012 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842040 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-session\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " 
pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842058 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842085 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-router-certs\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842104 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842139 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-audit-policies\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842166 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-error\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842201 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842228 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-service-ca\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842251 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34a4ebb3-154b-43c2-8566-8a638df7ecdf-audit-dir\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842290 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842300 4845 reconciler_common.go:293] "Volume 
detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842311 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842322 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842333 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842343 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842351 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842360 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc 
kubenswrapper[4845]: I0202 10:35:24.842370 4845 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98d47741-7063-487f-a38b-b9c398f3e07e-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842379 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842388 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842398 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842407 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6x5c\" (UniqueName: \"kubernetes.io/projected/98d47741-7063-487f-a38b-b9c398f3e07e-kube-api-access-c6x5c\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842415 4845 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98d47741-7063-487f-a38b-b9c398f3e07e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.842450 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/34a4ebb3-154b-43c2-8566-8a638df7ecdf-audit-dir\") pod 
\"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.844338 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.844348 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.845032 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-audit-policies\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.845128 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-service-ca\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.845586 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-error\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.846006 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-router-certs\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.846173 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.847007 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-login\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.847121 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-session\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " 
pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.848119 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.849011 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.849277 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/34a4ebb3-154b-43c2-8566-8a638df7ecdf-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:24 crc kubenswrapper[4845]: I0202 10:35:24.857613 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgvqw\" (UniqueName: \"kubernetes.io/projected/34a4ebb3-154b-43c2-8566-8a638df7ecdf-kube-api-access-mgvqw\") pod \"oauth-openshift-6447dfb5d9-lnbfd\" (UID: \"34a4ebb3-154b-43c2-8566-8a638df7ecdf\") " pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:25 crc kubenswrapper[4845]: I0202 10:35:25.027506 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:25 crc kubenswrapper[4845]: I0202 10:35:25.098259 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w989s"] Feb 02 10:35:25 crc kubenswrapper[4845]: I0202 10:35:25.100179 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-w989s"] Feb 02 10:35:25 crc kubenswrapper[4845]: I0202 10:35:25.483086 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd"] Feb 02 10:35:25 crc kubenswrapper[4845]: W0202 10:35:25.491182 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a4ebb3_154b_43c2_8566_8a638df7ecdf.slice/crio-6982373ab744699d1914de4c3993d91b5e1f9b3638bba103bc665cbb6df59906 WatchSource:0}: Error finding container 6982373ab744699d1914de4c3993d91b5e1f9b3638bba103bc665cbb6df59906: Status 404 returned error can't find the container with id 6982373ab744699d1914de4c3993d91b5e1f9b3638bba103bc665cbb6df59906 Feb 02 10:35:25 crc kubenswrapper[4845]: I0202 10:35:25.724354 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d47741-7063-487f-a38b-b9c398f3e07e" path="/var/lib/kubelet/pods/98d47741-7063-487f-a38b-b9c398f3e07e/volumes" Feb 02 10:35:25 crc kubenswrapper[4845]: I0202 10:35:25.758346 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" event={"ID":"34a4ebb3-154b-43c2-8566-8a638df7ecdf","Type":"ContainerStarted","Data":"6982373ab744699d1914de4c3993d91b5e1f9b3638bba103bc665cbb6df59906"} Feb 02 10:35:26 crc kubenswrapper[4845]: I0202 10:35:26.764821 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" 
event={"ID":"34a4ebb3-154b-43c2-8566-8a638df7ecdf","Type":"ContainerStarted","Data":"5d3f37ade19ae8ca89f23cd7a194059f11dd56d8ec3ec70eaee679848e91d69d"} Feb 02 10:35:26 crc kubenswrapper[4845]: I0202 10:35:26.765053 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:26 crc kubenswrapper[4845]: I0202 10:35:26.773068 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" Feb 02 10:35:26 crc kubenswrapper[4845]: I0202 10:35:26.795935 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6447dfb5d9-lnbfd" podStartSLOduration=27.795909643 podStartE2EDuration="27.795909643s" podCreationTimestamp="2026-02-02 10:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:26.786560821 +0000 UTC m=+207.877962311" watchObservedRunningTime="2026-02-02 10:35:26.795909643 +0000 UTC m=+207.887311093" Feb 02 10:35:26 crc kubenswrapper[4845]: I0202 10:35:26.985679 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-785795fb9f-wz4dw"] Feb 02 10:35:26 crc kubenswrapper[4845]: I0202 10:35:26.985940 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" podUID="5462ffc5-3458-4343-90de-625a307d56d0" containerName="controller-manager" containerID="cri-o://e682377b37f912142b25f90537ee5505ca0068713e5d79fc624dba607d356d8e" gracePeriod=30 Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.017751 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"] Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.018032 4845 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj" podUID="9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" containerName="route-controller-manager" containerID="cri-o://d824c9ea9648f75ba557aac1840de3f2484a4a72bc9492dd547ef7324c384bec" gracePeriod=30 Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.581549 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.772523 4845 generic.go:334] "Generic (PLEG): container finished" podID="5462ffc5-3458-4343-90de-625a307d56d0" containerID="e682377b37f912142b25f90537ee5505ca0068713e5d79fc624dba607d356d8e" exitCode=0 Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.772617 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" event={"ID":"5462ffc5-3458-4343-90de-625a307d56d0","Type":"ContainerDied","Data":"e682377b37f912142b25f90537ee5505ca0068713e5d79fc624dba607d356d8e"} Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.774334 4845 generic.go:334] "Generic (PLEG): container finished" podID="9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" containerID="d824c9ea9648f75ba557aac1840de3f2484a4a72bc9492dd547ef7324c384bec" exitCode=0 Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.774417 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj" event={"ID":"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9","Type":"ContainerDied","Data":"d824c9ea9648f75ba557aac1840de3f2484a4a72bc9492dd547ef7324c384bec"} Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.789077 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:35:27 crc kubenswrapper[4845]: I0202 10:35:27.827657 4845 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.139565 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.183281 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"] Feb 02 10:35:28 crc kubenswrapper[4845]: E0202 10:35:28.183518 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" containerName="route-controller-manager" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.183532 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" containerName="route-controller-manager" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.183649 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" containerName="route-controller-manager" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.184107 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.194243 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"] Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.202899 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-config\") pod \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.202987 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-serving-cert\") pod \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.203021 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxt9f\" (UniqueName: \"kubernetes.io/projected/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-kube-api-access-pxt9f\") pod \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.203072 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-client-ca\") pod \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\" (UID: \"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.204372 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" 
(UID: "9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.204628 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-config" (OuterVolumeSpecName: "config") pod "9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" (UID: "9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.208879 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" (UID: "9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.211106 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-kube-api-access-pxt9f" (OuterVolumeSpecName: "kube-api-access-pxt9f") pod "9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" (UID: "9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9"). InnerVolumeSpecName "kube-api-access-pxt9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.249265 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.304429 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-config\") pod \"5462ffc5-3458-4343-90de-625a307d56d0\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.304827 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5462ffc5-3458-4343-90de-625a307d56d0-serving-cert\") pod \"5462ffc5-3458-4343-90de-625a307d56d0\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.305350 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-config" (OuterVolumeSpecName: "config") pod "5462ffc5-3458-4343-90de-625a307d56d0" (UID: "5462ffc5-3458-4343-90de-625a307d56d0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.305528 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2czr9\" (UniqueName: \"kubernetes.io/projected/5462ffc5-3458-4343-90de-625a307d56d0-kube-api-access-2czr9\") pod \"5462ffc5-3458-4343-90de-625a307d56d0\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.305707 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-client-ca\") pod \"5462ffc5-3458-4343-90de-625a307d56d0\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.305854 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-proxy-ca-bundles\") pod \"5462ffc5-3458-4343-90de-625a307d56d0\" (UID: \"5462ffc5-3458-4343-90de-625a307d56d0\") " Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.306233 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-client-ca" (OuterVolumeSpecName: "client-ca") pod "5462ffc5-3458-4343-90de-625a307d56d0" (UID: "5462ffc5-3458-4343-90de-625a307d56d0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.306255 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-config\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.306479 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwmlm\" (UniqueName: \"kubernetes.io/projected/c21c318c-0429-4775-ac72-4556534d415e-kube-api-access-rwmlm\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.306596 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21c318c-0429-4775-ac72-4556534d415e-serving-cert\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.306637 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5462ffc5-3458-4343-90de-625a307d56d0" (UID: "5462ffc5-3458-4343-90de-625a307d56d0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.306751 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-client-ca\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.306969 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.307063 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.307143 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.307294 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.307386 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5462ffc5-3458-4343-90de-625a307d56d0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.307512 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.307603 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxt9f\" (UniqueName: \"kubernetes.io/projected/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9-kube-api-access-pxt9f\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.307913 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5462ffc5-3458-4343-90de-625a307d56d0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5462ffc5-3458-4343-90de-625a307d56d0" (UID: "5462ffc5-3458-4343-90de-625a307d56d0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.308150 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5462ffc5-3458-4343-90de-625a307d56d0-kube-api-access-2czr9" (OuterVolumeSpecName: "kube-api-access-2czr9") pod "5462ffc5-3458-4343-90de-625a307d56d0" (UID: "5462ffc5-3458-4343-90de-625a307d56d0"). InnerVolumeSpecName "kube-api-access-2czr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.409271 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-client-ca\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.409347 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-config\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.409368 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmlm\" (UniqueName: \"kubernetes.io/projected/c21c318c-0429-4775-ac72-4556534d415e-kube-api-access-rwmlm\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.409397 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21c318c-0429-4775-ac72-4556534d415e-serving-cert\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.409433 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2czr9\" (UniqueName: 
\"kubernetes.io/projected/5462ffc5-3458-4343-90de-625a307d56d0-kube-api-access-2czr9\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.409445 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5462ffc5-3458-4343-90de-625a307d56d0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.410979 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-client-ca\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.411460 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-config\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.416107 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21c318c-0429-4775-ac72-4556534d415e-serving-cert\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.439021 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwmlm\" (UniqueName: \"kubernetes.io/projected/c21c318c-0429-4775-ac72-4556534d415e-kube-api-access-rwmlm\") pod \"route-controller-manager-6f88969cc8-2vg4q\" (UID: 
\"c21c318c-0429-4775-ac72-4556534d415e\") " pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.504530 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.782824 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw"
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.782810 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-785795fb9f-wz4dw" event={"ID":"5462ffc5-3458-4343-90de-625a307d56d0","Type":"ContainerDied","Data":"24a79b6e178f4c50f852ca424030cfff6044bf1a543bc4a2ad28acaa9dd3e5e1"}
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.782958 4845 scope.go:117] "RemoveContainer" containerID="e682377b37f912142b25f90537ee5505ca0068713e5d79fc624dba607d356d8e"
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.785792 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.785851 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj" event={"ID":"9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9","Type":"ContainerDied","Data":"edd76338c4cf9565b4d4c3ff6adff77448c36e6ba5ba8384ccd4a0a7e5b2dd8a"}
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.804576 4845 scope.go:117] "RemoveContainer" containerID="d824c9ea9648f75ba557aac1840de3f2484a4a72bc9492dd547ef7324c384bec"
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.821961 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-785795fb9f-wz4dw"]
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.832209 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-785795fb9f-wz4dw"]
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.836295 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"]
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.840653 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5f8f98956b-pr9zj"]
Feb 02 10:35:28 crc kubenswrapper[4845]: I0202 10:35:28.985017 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"]
Feb 02 10:35:28 crc kubenswrapper[4845]: W0202 10:35:28.996592 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc21c318c_0429_4775_ac72_4556534d415e.slice/crio-75c70fe87f49a1adc8e4a51d12bf78e0c4d3ac5c8bc0b14cda22b9b93f914921 WatchSource:0}: Error finding container 75c70fe87f49a1adc8e4a51d12bf78e0c4d3ac5c8bc0b14cda22b9b93f914921: Status 404 returned error can't find the container with id 75c70fe87f49a1adc8e4a51d12bf78e0c4d3ac5c8bc0b14cda22b9b93f914921
Feb 02 10:35:29 crc kubenswrapper[4845]: I0202 10:35:29.718910 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5462ffc5-3458-4343-90de-625a307d56d0" path="/var/lib/kubelet/pods/5462ffc5-3458-4343-90de-625a307d56d0/volumes"
Feb 02 10:35:29 crc kubenswrapper[4845]: I0202 10:35:29.719411 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9" path="/var/lib/kubelet/pods/9d5d1c9b-bf0d-4e5f-b70c-5460f17e53a9/volumes"
Feb 02 10:35:29 crc kubenswrapper[4845]: I0202 10:35:29.802016 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" event={"ID":"c21c318c-0429-4775-ac72-4556534d415e","Type":"ContainerStarted","Data":"648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7"}
Feb 02 10:35:29 crc kubenswrapper[4845]: I0202 10:35:29.802055 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" event={"ID":"c21c318c-0429-4775-ac72-4556534d415e","Type":"ContainerStarted","Data":"75c70fe87f49a1adc8e4a51d12bf78e0c4d3ac5c8bc0b14cda22b9b93f914921"}
Feb 02 10:35:29 crc kubenswrapper[4845]: I0202 10:35:29.802358 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"
Feb 02 10:35:29 crc kubenswrapper[4845]: I0202 10:35:29.807948 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"
Feb 02 10:35:29 crc kubenswrapper[4845]: I0202 10:35:29.824916 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" podStartSLOduration=2.824875891 podStartE2EDuration="2.824875891s" podCreationTimestamp="2026-02-02 10:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:29.822060999 +0000 UTC m=+210.913462459" watchObservedRunningTime="2026-02-02 10:35:29.824875891 +0000 UTC m=+210.916277341"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.209180 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-598bc44594-t4nt7"]
Feb 02 10:35:30 crc kubenswrapper[4845]: E0202 10:35:30.210064 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5462ffc5-3458-4343-90de-625a307d56d0" containerName="controller-manager"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.210228 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="5462ffc5-3458-4343-90de-625a307d56d0" containerName="controller-manager"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.210533 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="5462ffc5-3458-4343-90de-625a307d56d0" containerName="controller-manager"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.211287 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.216820 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.217015 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.218400 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.218991 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.219492 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.219916 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.232610 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.234977 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-598bc44594-t4nt7"]
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.337420 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-client-ca\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.338133 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-proxy-ca-bundles\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.338493 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7chfp\" (UniqueName: \"kubernetes.io/projected/70fc0f2d-6a96-4333-b777-9149d48db9a9-kube-api-access-7chfp\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.338822 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-config\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.339141 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fc0f2d-6a96-4333-b777-9149d48db9a9-serving-cert\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.441087 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-proxy-ca-bundles\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.442118 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7chfp\" (UniqueName: \"kubernetes.io/projected/70fc0f2d-6a96-4333-b777-9149d48db9a9-kube-api-access-7chfp\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.444237 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-config\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.444442 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fc0f2d-6a96-4333-b777-9149d48db9a9-serving-cert\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.444562 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-client-ca\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.443510 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-proxy-ca-bundles\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.446316 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-client-ca\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.446828 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-config\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.455205 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fc0f2d-6a96-4333-b777-9149d48db9a9-serving-cert\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.472577 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7chfp\" (UniqueName: \"kubernetes.io/projected/70fc0f2d-6a96-4333-b777-9149d48db9a9-kube-api-access-7chfp\") pod \"controller-manager-598bc44594-t4nt7\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") " pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:30 crc kubenswrapper[4845]: I0202 10:35:30.548158 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:31 crc kubenswrapper[4845]: I0202 10:35:31.049400 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-598bc44594-t4nt7"]
Feb 02 10:35:31 crc kubenswrapper[4845]: W0202 10:35:31.054868 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70fc0f2d_6a96_4333_b777_9149d48db9a9.slice/crio-8da502ed26de0c24190d879fb4dd0874ff65fc05bf9ee8f199f7bf5b2e694da3 WatchSource:0}: Error finding container 8da502ed26de0c24190d879fb4dd0874ff65fc05bf9ee8f199f7bf5b2e694da3: Status 404 returned error can't find the container with id 8da502ed26de0c24190d879fb4dd0874ff65fc05bf9ee8f199f7bf5b2e694da3
Feb 02 10:35:31 crc kubenswrapper[4845]: I0202 10:35:31.819553 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7" event={"ID":"70fc0f2d-6a96-4333-b777-9149d48db9a9","Type":"ContainerStarted","Data":"cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb"}
Feb 02 10:35:31 crc kubenswrapper[4845]: I0202 10:35:31.820082 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7" event={"ID":"70fc0f2d-6a96-4333-b777-9149d48db9a9","Type":"ContainerStarted","Data":"8da502ed26de0c24190d879fb4dd0874ff65fc05bf9ee8f199f7bf5b2e694da3"}
Feb 02 10:35:31 crc kubenswrapper[4845]: I0202 10:35:31.842637 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7" podStartSLOduration=4.842622746 podStartE2EDuration="4.842622746s" podCreationTimestamp="2026-02-02 10:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:31.842220434 +0000 UTC m=+212.933621884" watchObservedRunningTime="2026-02-02 10:35:31.842622746 +0000 UTC m=+212.934024186"
Feb 02 10:35:32 crc kubenswrapper[4845]: I0202 10:35:32.827526 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:32 crc kubenswrapper[4845]: I0202 10:35:32.833469 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.237602 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.238321 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.238390 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9"
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.239087 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.239177 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428" gracePeriod=600
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.919371 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428" exitCode=0
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.919471 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428"}
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.919987 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"df9230c12c17f28801d9b1be21f07e2881dfba8fde329097a5e90d09e1d981f3"}
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.957926 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-598bc44594-t4nt7"]
Feb 02 10:35:46 crc kubenswrapper[4845]: I0202 10:35:46.958161 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7" podUID="70fc0f2d-6a96-4333-b777-9149d48db9a9" containerName="controller-manager" containerID="cri-o://cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb" gracePeriod=30
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.049399 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"]
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.049650 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" podUID="c21c318c-0429-4775-ac72-4556534d415e" containerName="route-controller-manager" containerID="cri-o://648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7" gracePeriod=30
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.602374 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.606975 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.684874 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-proxy-ca-bundles\") pod \"70fc0f2d-6a96-4333-b777-9149d48db9a9\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.684972 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-config\") pod \"c21c318c-0429-4775-ac72-4556534d415e\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.685022 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7chfp\" (UniqueName: \"kubernetes.io/projected/70fc0f2d-6a96-4333-b777-9149d48db9a9-kube-api-access-7chfp\") pod \"70fc0f2d-6a96-4333-b777-9149d48db9a9\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.685096 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fc0f2d-6a96-4333-b777-9149d48db9a9-serving-cert\") pod \"70fc0f2d-6a96-4333-b777-9149d48db9a9\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.685132 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-client-ca\") pod \"70fc0f2d-6a96-4333-b777-9149d48db9a9\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.685227 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-config\") pod \"70fc0f2d-6a96-4333-b777-9149d48db9a9\" (UID: \"70fc0f2d-6a96-4333-b777-9149d48db9a9\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.685851 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "70fc0f2d-6a96-4333-b777-9149d48db9a9" (UID: "70fc0f2d-6a96-4333-b777-9149d48db9a9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.685910 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-config" (OuterVolumeSpecName: "config") pod "c21c318c-0429-4775-ac72-4556534d415e" (UID: "c21c318c-0429-4775-ac72-4556534d415e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.685938 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-config" (OuterVolumeSpecName: "config") pod "70fc0f2d-6a96-4333-b777-9149d48db9a9" (UID: "70fc0f2d-6a96-4333-b777-9149d48db9a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.686060 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-client-ca" (OuterVolumeSpecName: "client-ca") pod "70fc0f2d-6a96-4333-b777-9149d48db9a9" (UID: "70fc0f2d-6a96-4333-b777-9149d48db9a9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.686192 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-client-ca\") pod \"c21c318c-0429-4775-ac72-4556534d415e\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.686743 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-client-ca" (OuterVolumeSpecName: "client-ca") pod "c21c318c-0429-4775-ac72-4556534d415e" (UID: "c21c318c-0429-4775-ac72-4556534d415e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.686850 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwmlm\" (UniqueName: \"kubernetes.io/projected/c21c318c-0429-4775-ac72-4556534d415e-kube-api-access-rwmlm\") pod \"c21c318c-0429-4775-ac72-4556534d415e\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.687368 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21c318c-0429-4775-ac72-4556534d415e-serving-cert\") pod \"c21c318c-0429-4775-ac72-4556534d415e\" (UID: \"c21c318c-0429-4775-ac72-4556534d415e\") "
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.687717 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-client-ca\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.687748 4845 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.687768 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c21c318c-0429-4775-ac72-4556534d415e-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.687785 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.687802 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70fc0f2d-6a96-4333-b777-9149d48db9a9-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.691072 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21c318c-0429-4775-ac72-4556534d415e-kube-api-access-rwmlm" (OuterVolumeSpecName: "kube-api-access-rwmlm") pod "c21c318c-0429-4775-ac72-4556534d415e" (UID: "c21c318c-0429-4775-ac72-4556534d415e"). InnerVolumeSpecName "kube-api-access-rwmlm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.691156 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c21c318c-0429-4775-ac72-4556534d415e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c21c318c-0429-4775-ac72-4556534d415e" (UID: "c21c318c-0429-4775-ac72-4556534d415e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.698633 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70fc0f2d-6a96-4333-b777-9149d48db9a9-kube-api-access-7chfp" (OuterVolumeSpecName: "kube-api-access-7chfp") pod "70fc0f2d-6a96-4333-b777-9149d48db9a9" (UID: "70fc0f2d-6a96-4333-b777-9149d48db9a9"). InnerVolumeSpecName "kube-api-access-7chfp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.698990 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70fc0f2d-6a96-4333-b777-9149d48db9a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "70fc0f2d-6a96-4333-b777-9149d48db9a9" (UID: "70fc0f2d-6a96-4333-b777-9149d48db9a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.788955 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwmlm\" (UniqueName: \"kubernetes.io/projected/c21c318c-0429-4775-ac72-4556534d415e-kube-api-access-rwmlm\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.789782 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c21c318c-0429-4775-ac72-4556534d415e-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.789911 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7chfp\" (UniqueName: \"kubernetes.io/projected/70fc0f2d-6a96-4333-b777-9149d48db9a9-kube-api-access-7chfp\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.789931 4845 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70fc0f2d-6a96-4333-b777-9149d48db9a9-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.925633 4845 generic.go:334] "Generic (PLEG): container finished" podID="c21c318c-0429-4775-ac72-4556534d415e" containerID="648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7" exitCode=0
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.925719 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.925730 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" event={"ID":"c21c318c-0429-4775-ac72-4556534d415e","Type":"ContainerDied","Data":"648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7"}
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.926154 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q" event={"ID":"c21c318c-0429-4775-ac72-4556534d415e","Type":"ContainerDied","Data":"75c70fe87f49a1adc8e4a51d12bf78e0c4d3ac5c8bc0b14cda22b9b93f914921"}
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.926177 4845 scope.go:117] "RemoveContainer" containerID="648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.928880 4845 generic.go:334] "Generic (PLEG): container finished" podID="70fc0f2d-6a96-4333-b777-9149d48db9a9" containerID="cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb" exitCode=0
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.928955 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7" event={"ID":"70fc0f2d-6a96-4333-b777-9149d48db9a9","Type":"ContainerDied","Data":"cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb"}
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.929001 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7" event={"ID":"70fc0f2d-6a96-4333-b777-9149d48db9a9","Type":"ContainerDied","Data":"8da502ed26de0c24190d879fb4dd0874ff65fc05bf9ee8f199f7bf5b2e694da3"}
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.929009 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598bc44594-t4nt7"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.947108 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"]
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.950868 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f88969cc8-2vg4q"]
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.951037 4845 scope.go:117] "RemoveContainer" containerID="648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7"
Feb 02 10:35:47 crc kubenswrapper[4845]: E0202 10:35:47.952739 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7\": container with ID starting with 648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7 not found: ID does not exist" containerID="648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.952774 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7"} err="failed to get container status \"648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7\": rpc error: code = NotFound desc = could not find container \"648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7\": container with ID starting with 648b0f1b7867bb5fe6a652841b964331d54fca10f74d2f426e3d89d581b28ca7 not found: ID does not exist"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.952796 4845 scope.go:117] "RemoveContainer" containerID="cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.959709 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-598bc44594-t4nt7"]
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.963510 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-598bc44594-t4nt7"]
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.971955 4845 scope.go:117] "RemoveContainer" containerID="cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb"
Feb 02 10:35:47 crc kubenswrapper[4845]: E0202 10:35:47.972488 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb\": container with ID starting with cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb not found: ID does not exist" containerID="cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb"
Feb 02 10:35:47 crc kubenswrapper[4845]: I0202 10:35:47.972537 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb"} err="failed to get container status \"cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb\": rpc error: code = NotFound desc = could not find container \"cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb\": container with ID starting with cbd30dac003bbb0eb946cba982acab9780e5a61b47e1d8afac620fceb739c7fb not found: ID does not exist"
Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.234339 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m"]
Feb 02 10:35:48 crc kubenswrapper[4845]: E0202 10:35:48.235161 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70fc0f2d-6a96-4333-b777-9149d48db9a9" containerName="controller-manager"
Feb 02 10:35:48
crc kubenswrapper[4845]: I0202 10:35:48.235181 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="70fc0f2d-6a96-4333-b777-9149d48db9a9" containerName="controller-manager" Feb 02 10:35:48 crc kubenswrapper[4845]: E0202 10:35:48.235210 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21c318c-0429-4775-ac72-4556534d415e" containerName="route-controller-manager" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.235219 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21c318c-0429-4775-ac72-4556534d415e" containerName="route-controller-manager" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.235355 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21c318c-0429-4775-ac72-4556534d415e" containerName="route-controller-manager" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.235378 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="70fc0f2d-6a96-4333-b777-9149d48db9a9" containerName="controller-manager" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.236991 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.239362 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.240494 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.240680 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.241339 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.242110 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.242539 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.246161 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f44fddcf4-ffjng"] Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.246938 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.251658 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.252555 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.253784 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.254023 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.254272 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.255625 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.259303 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.260265 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m"] Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.266563 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f44fddcf4-ffjng"] Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.295624 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/51c80373-5133-4cd9-9289-cc50013b0875-client-ca\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.295670 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c80373-5133-4cd9-9289-cc50013b0875-config\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.295715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-serving-cert\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.295851 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crk5q\" (UniqueName: \"kubernetes.io/projected/51c80373-5133-4cd9-9289-cc50013b0875-kube-api-access-crk5q\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.295902 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtz49\" (UniqueName: \"kubernetes.io/projected/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-kube-api-access-mtz49\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " 
pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.295984 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-client-ca\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.296037 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c80373-5133-4cd9-9289-cc50013b0875-serving-cert\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.296079 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-proxy-ca-bundles\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.296126 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-config\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397551 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-client-ca\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397594 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51c80373-5133-4cd9-9289-cc50013b0875-serving-cert\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397622 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-proxy-ca-bundles\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397652 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-config\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397682 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51c80373-5133-4cd9-9289-cc50013b0875-client-ca\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397702 
4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c80373-5133-4cd9-9289-cc50013b0875-config\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397736 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-serving-cert\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397775 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crk5q\" (UniqueName: \"kubernetes.io/projected/51c80373-5133-4cd9-9289-cc50013b0875-kube-api-access-crk5q\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.397797 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtz49\" (UniqueName: \"kubernetes.io/projected/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-kube-api-access-mtz49\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.398630 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51c80373-5133-4cd9-9289-cc50013b0875-client-ca\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: 
\"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.398689 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-client-ca\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.399397 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-proxy-ca-bundles\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.399513 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-config\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.400073 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51c80373-5133-4cd9-9289-cc50013b0875-config\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.402757 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/51c80373-5133-4cd9-9289-cc50013b0875-serving-cert\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.405827 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-serving-cert\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.420149 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtz49\" (UniqueName: \"kubernetes.io/projected/4f7fcdd8-93f8-4b27-b281-b94eaf0ce813-kube-api-access-mtz49\") pod \"controller-manager-6f44fddcf4-ffjng\" (UID: \"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813\") " pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.420722 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crk5q\" (UniqueName: \"kubernetes.io/projected/51c80373-5133-4cd9-9289-cc50013b0875-kube-api-access-crk5q\") pod \"route-controller-manager-764c9d5fcd-lwb8m\" (UID: \"51c80373-5133-4cd9-9289-cc50013b0875\") " pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.592472 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:48 crc kubenswrapper[4845]: I0202 10:35:48.601163 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.005957 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f44fddcf4-ffjng"] Feb 02 10:35:49 crc kubenswrapper[4845]: W0202 10:35:49.010277 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f7fcdd8_93f8_4b27_b281_b94eaf0ce813.slice/crio-26db9b88bbf58d42f8e069a10eaa4bb6f6b1eb2c315d1b6aa45867eec78a545d WatchSource:0}: Error finding container 26db9b88bbf58d42f8e069a10eaa4bb6f6b1eb2c315d1b6aa45867eec78a545d: Status 404 returned error can't find the container with id 26db9b88bbf58d42f8e069a10eaa4bb6f6b1eb2c315d1b6aa45867eec78a545d Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.052255 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m"] Feb 02 10:35:49 crc kubenswrapper[4845]: W0202 10:35:49.058028 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c80373_5133_4cd9_9289_cc50013b0875.slice/crio-08cd64fa4a1ed19829fc653eb7a69b66bf52a7d455fb5ee1c4669305df19c9d5 WatchSource:0}: Error finding container 08cd64fa4a1ed19829fc653eb7a69b66bf52a7d455fb5ee1c4669305df19c9d5: Status 404 returned error can't find the container with id 08cd64fa4a1ed19829fc653eb7a69b66bf52a7d455fb5ee1c4669305df19c9d5 Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.720359 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70fc0f2d-6a96-4333-b777-9149d48db9a9" path="/var/lib/kubelet/pods/70fc0f2d-6a96-4333-b777-9149d48db9a9/volumes" Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.721154 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21c318c-0429-4775-ac72-4556534d415e" 
path="/var/lib/kubelet/pods/c21c318c-0429-4775-ac72-4556534d415e/volumes" Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.942037 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" event={"ID":"51c80373-5133-4cd9-9289-cc50013b0875","Type":"ContainerStarted","Data":"350548c497af7726933e5093f670e11e6ffee57506d1ac026195d5e29832b064"} Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.942359 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.942369 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" event={"ID":"51c80373-5133-4cd9-9289-cc50013b0875","Type":"ContainerStarted","Data":"08cd64fa4a1ed19829fc653eb7a69b66bf52a7d455fb5ee1c4669305df19c9d5"} Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.943611 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" event={"ID":"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813","Type":"ContainerStarted","Data":"2e46816a2a756cc92ce9e3d35496bd32dccef94d25b9bdd19a168af6da2f39ab"} Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.943653 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" event={"ID":"4f7fcdd8-93f8-4b27-b281-b94eaf0ce813","Type":"ContainerStarted","Data":"26db9b88bbf58d42f8e069a10eaa4bb6f6b1eb2c315d1b6aa45867eec78a545d"} Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.947148 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.963877 4845 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-764c9d5fcd-lwb8m" podStartSLOduration=2.963856164 podStartE2EDuration="2.963856164s" podCreationTimestamp="2026-02-02 10:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:49.960221768 +0000 UTC m=+231.051623228" watchObservedRunningTime="2026-02-02 10:35:49.963856164 +0000 UTC m=+231.055257624" Feb 02 10:35:49 crc kubenswrapper[4845]: I0202 10:35:49.988194 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" podStartSLOduration=3.988171294 podStartE2EDuration="3.988171294s" podCreationTimestamp="2026-02-02 10:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:49.984653191 +0000 UTC m=+231.076054641" watchObservedRunningTime="2026-02-02 10:35:49.988171294 +0000 UTC m=+231.079572744" Feb 02 10:35:50 crc kubenswrapper[4845]: I0202 10:35:50.949632 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:50 crc kubenswrapper[4845]: I0202 10:35:50.953795 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f44fddcf4-ffjng" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.944592 4845 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.945720 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.946051 4845 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.946560 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e" gracePeriod=15 Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.946590 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904" gracePeriod=15 Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.946675 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f" gracePeriod=15 Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.946779 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404" gracePeriod=15 Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.948687 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b" gracePeriod=15 Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.948841 4845 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:35:54 crc kubenswrapper[4845]: E0202 10:35:54.949222 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949246 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:35:54 crc kubenswrapper[4845]: E0202 10:35:54.949265 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949278 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:35:54 crc kubenswrapper[4845]: E0202 10:35:54.949297 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949310 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:35:54 crc kubenswrapper[4845]: E0202 10:35:54.949327 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949338 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:54 crc kubenswrapper[4845]: E0202 
10:35:54.949357 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949369 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:54 crc kubenswrapper[4845]: E0202 10:35:54.949387 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949399 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:35:54 crc kubenswrapper[4845]: E0202 10:35:54.949413 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949424 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949599 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949625 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949641 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949658 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949677 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949696 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:35:54 crc kubenswrapper[4845]: E0202 10:35:54.949918 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.949936 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:54 crc kubenswrapper[4845]: I0202 10:35:54.950125 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.035825 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.035909 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.035971 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.036012 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.036048 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.036095 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.036166 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 
10:35:55.036204 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.137410 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.137815 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.137851 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.137550 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc 
kubenswrapper[4845]: I0202 10:35:55.137874 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.137992 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.138034 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.138086 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.137993 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.138122 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.138145 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.138173 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.137931 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.138189 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.138211 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.138298 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.987077 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.988575 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.989323 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904" exitCode=0 Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.989351 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b" exitCode=0 Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.989360 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404" exitCode=0 Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.989367 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f" exitCode=2 Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.989456 4845 scope.go:117] "RemoveContainer" containerID="5ff3296a372879cb87edb5cfc38562c90d60096e7990fafa14c9eb8a50fba1c5" Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.991177 4845 generic.go:334] "Generic (PLEG): container finished" podID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" containerID="706922acec942089a8fd3b03c826b07b1a0b641cf0a3415a88c290fcbe0bd620" exitCode=0 Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.991207 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"21f3600b-d9f0-4d8e-8a5a-2b03161164b4","Type":"ContainerDied","Data":"706922acec942089a8fd3b03c826b07b1a0b641cf0a3415a88c290fcbe0bd620"} Feb 02 10:35:55 crc kubenswrapper[4845]: I0202 10:35:55.991804 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.000402 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.334517 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.335821 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.336563 4845 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.337114 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.469725 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.469967 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.470180 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.470260 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.470370 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.470515 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.470949 4845 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.470983 4845 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.471001 4845 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.503735 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.504846 4845 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.505423 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.571445 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kube-api-access\") pod \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.571690 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-var-lock\") pod \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.571799 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kubelet-dir\") pod \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\" (UID: \"21f3600b-d9f0-4d8e-8a5a-2b03161164b4\") " Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.572082 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "21f3600b-d9f0-4d8e-8a5a-2b03161164b4" (UID: "21f3600b-d9f0-4d8e-8a5a-2b03161164b4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.572111 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-var-lock" (OuterVolumeSpecName: "var-lock") pod "21f3600b-d9f0-4d8e-8a5a-2b03161164b4" (UID: "21f3600b-d9f0-4d8e-8a5a-2b03161164b4"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.577075 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "21f3600b-d9f0-4d8e-8a5a-2b03161164b4" (UID: "21f3600b-d9f0-4d8e-8a5a-2b03161164b4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.673767 4845 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.673811 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.673828 4845 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/21f3600b-d9f0-4d8e-8a5a-2b03161164b4-var-lock\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:57 crc kubenswrapper[4845]: I0202 10:35:57.719569 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.010131 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"21f3600b-d9f0-4d8e-8a5a-2b03161164b4","Type":"ContainerDied","Data":"ae045aee676ac5379e234ac23abc2cda69a140b263c32736f6d478088a0d6ce2"} Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.010183 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.010185 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae045aee676ac5379e234ac23abc2cda69a140b263c32736f6d478088a0d6ce2" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.013164 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.013946 4845 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e" exitCode=0 Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.014017 4845 scope.go:117] "RemoveContainer" containerID="72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.014043 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.014766 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.015153 4845 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.015492 4845 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.015757 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.017185 4845 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection 
refused" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.017591 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.032577 4845 scope.go:117] "RemoveContainer" containerID="a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.050487 4845 scope.go:117] "RemoveContainer" containerID="58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.065269 4845 scope.go:117] "RemoveContainer" containerID="079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.081309 4845 scope.go:117] "RemoveContainer" containerID="ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.098851 4845 scope.go:117] "RemoveContainer" containerID="07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.117860 4845 scope.go:117] "RemoveContainer" containerID="72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904" Feb 02 10:35:58 crc kubenswrapper[4845]: E0202 10:35:58.118880 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\": container with ID starting with 72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904 not found: ID does not exist" containerID="72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.118925 4845 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904"} err="failed to get container status \"72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\": rpc error: code = NotFound desc = could not find container \"72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904\": container with ID starting with 72600d42e64c46b10166b8eaa27c0cf3472aca69343fd952de501981b2c34904 not found: ID does not exist" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.118946 4845 scope.go:117] "RemoveContainer" containerID="a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b" Feb 02 10:35:58 crc kubenswrapper[4845]: E0202 10:35:58.119324 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\": container with ID starting with a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b not found: ID does not exist" containerID="a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.119347 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b"} err="failed to get container status \"a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\": rpc error: code = NotFound desc = could not find container \"a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b\": container with ID starting with a6d578bf4f9dc17f5b6a4273bab3aa60907f3144c5990669117db64ad4426b5b not found: ID does not exist" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.119360 4845 scope.go:117] "RemoveContainer" containerID="58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404" Feb 02 10:35:58 crc kubenswrapper[4845]: E0202 
10:35:58.119616 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\": container with ID starting with 58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404 not found: ID does not exist" containerID="58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.119670 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404"} err="failed to get container status \"58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\": rpc error: code = NotFound desc = could not find container \"58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404\": container with ID starting with 58191ca6c2bdd6b2996155235018e669f275c0e09dcb8e463c493ed2cfe53404 not found: ID does not exist" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.119686 4845 scope.go:117] "RemoveContainer" containerID="079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f" Feb 02 10:35:58 crc kubenswrapper[4845]: E0202 10:35:58.121727 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\": container with ID starting with 079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f not found: ID does not exist" containerID="079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.121746 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f"} err="failed to get container status \"079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\": rpc 
error: code = NotFound desc = could not find container \"079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f\": container with ID starting with 079eded7443742e375dcd20de8d1b6b4e60a4d5de4289a483f009f0247894c8f not found: ID does not exist" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.121780 4845 scope.go:117] "RemoveContainer" containerID="ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e" Feb 02 10:35:58 crc kubenswrapper[4845]: E0202 10:35:58.122101 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\": container with ID starting with ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e not found: ID does not exist" containerID="ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.122118 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e"} err="failed to get container status \"ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\": rpc error: code = NotFound desc = could not find container \"ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e\": container with ID starting with ee7ef152f60c567afaf821367923b75eff8761f1c286a2106de41b795d37902e not found: ID does not exist" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.122130 4845 scope.go:117] "RemoveContainer" containerID="07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850" Feb 02 10:35:58 crc kubenswrapper[4845]: E0202 10:35:58.122315 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\": container with ID starting with 
07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850 not found: ID does not exist" containerID="07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850" Feb 02 10:35:58 crc kubenswrapper[4845]: I0202 10:35:58.122334 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850"} err="failed to get container status \"07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\": rpc error: code = NotFound desc = could not find container \"07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850\": container with ID starting with 07753d42b1f45141fa85e04e5ef07cd89d67c405b95a52b8f64aa0fa3785a850 not found: ID does not exist" Feb 02 10:35:59 crc kubenswrapper[4845]: I0202 10:35:59.714838 4845 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:59 crc kubenswrapper[4845]: I0202 10:35:59.715713 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:35:59 crc kubenswrapper[4845]: E0202 10:35:59.984528 4845 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:59 crc kubenswrapper[4845]: I0202 10:35:59.985654 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:36:00 crc kubenswrapper[4845]: E0202 10:36:00.025037 4845 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18906798ea8f0bd0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:36:00.024538064 +0000 UTC m=+241.115939514,LastTimestamp:2026-02-02 10:36:00.024538064 +0000 UTC m=+241.115939514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:36:00 crc kubenswrapper[4845]: I0202 10:36:00.047374 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cee4ad5f52f15862d1d4518fd97b395ab4fe572eb89976c35570a9b62fe0884c"} Feb 02 10:36:00 crc kubenswrapper[4845]: E0202 10:36:00.423986 4845 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:00 crc kubenswrapper[4845]: E0202 10:36:00.425612 4845 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:00 crc kubenswrapper[4845]: E0202 10:36:00.426009 4845 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:00 crc kubenswrapper[4845]: E0202 10:36:00.426268 4845 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:00 crc kubenswrapper[4845]: E0202 10:36:00.426493 4845 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:00 crc kubenswrapper[4845]: I0202 10:36:00.426526 4845 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 02 10:36:00 crc kubenswrapper[4845]: E0202 10:36:00.426742 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="200ms" Feb 02 10:36:00 crc kubenswrapper[4845]: E0202 10:36:00.627900 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="400ms" Feb 02 10:36:01 crc 
kubenswrapper[4845]: E0202 10:36:01.028567 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="800ms" Feb 02 10:36:01 crc kubenswrapper[4845]: I0202 10:36:01.054953 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446"} Feb 02 10:36:01 crc kubenswrapper[4845]: I0202 10:36:01.055710 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:01 crc kubenswrapper[4845]: E0202 10:36:01.055745 4845 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:36:01 crc kubenswrapper[4845]: E0202 10:36:01.830389 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="1.6s" Feb 02 10:36:02 crc kubenswrapper[4845]: E0202 10:36:02.064578 4845 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.224:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:36:03 crc kubenswrapper[4845]: E0202 10:36:03.431463 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="3.2s" Feb 02 10:36:05 crc kubenswrapper[4845]: E0202 10:36:05.872049 4845 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.224:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18906798ea8f0bd0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:36:00.024538064 +0000 UTC m=+241.115939514,LastTimestamp:2026-02-02 10:36:00.024538064 +0000 UTC m=+241.115939514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:36:06 crc kubenswrapper[4845]: E0202 10:36:06.632365 4845 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.224:6443: connect: connection refused" interval="6.4s" Feb 02 10:36:08 crc kubenswrapper[4845]: I0202 10:36:08.711689 
4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:08 crc kubenswrapper[4845]: I0202 10:36:08.713981 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:08 crc kubenswrapper[4845]: I0202 10:36:08.729941 4845 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:08 crc kubenswrapper[4845]: I0202 10:36:08.729980 4845 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:08 crc kubenswrapper[4845]: E0202 10:36:08.730496 4845 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:08 crc kubenswrapper[4845]: I0202 10:36:08.731151 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:08 crc kubenswrapper[4845]: W0202 10:36:08.752559 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-16759ce193aa3d4b5fec2e2cfc9415b9df22f822bca3fc667eeab2415abf27c4 WatchSource:0}: Error finding container 16759ce193aa3d4b5fec2e2cfc9415b9df22f822bca3fc667eeab2415abf27c4: Status 404 returned error can't find the container with id 16759ce193aa3d4b5fec2e2cfc9415b9df22f822bca3fc667eeab2415abf27c4 Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.107758 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.108013 4845 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf" exitCode=1 Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.108062 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf"} Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.108538 4845 scope.go:117] "RemoveContainer" containerID="60395e5b5c5225bf27cc1401c84b57081cb08f8b76c5f8ea9f0e472c2d37abdf" Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.109215 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection 
refused" Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.109574 4845 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.110762 4845 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3752fb9bb94368a7ec84e8e8ccd1f9ab95621ec384cbde95acb1893f614bdde2" exitCode=0 Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.110782 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3752fb9bb94368a7ec84e8e8ccd1f9ab95621ec384cbde95acb1893f614bdde2"} Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.110798 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"16759ce193aa3d4b5fec2e2cfc9415b9df22f822bca3fc667eeab2415abf27c4"} Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.110993 4845 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.111010 4845 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.111418 4845 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:09 crc kubenswrapper[4845]: E0202 10:36:09.111542 4845 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.111616 4845 status_manager.go:851] "Failed to get status for pod" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.224:6443: connect: connection refused" Feb 02 10:36:09 crc kubenswrapper[4845]: I0202 10:36:09.230940 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:36:10 crc kubenswrapper[4845]: I0202 10:36:10.121564 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 10:36:10 crc kubenswrapper[4845]: I0202 10:36:10.122049 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"06cca458fbbab7cb6f7429a24188d88f05f7680c5bd0a1399a150c84889692ba"} Feb 02 10:36:10 crc kubenswrapper[4845]: I0202 10:36:10.125549 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"31650c369a34fb4b7c02e1dfba30243032457402224de23d3fb3de47ea29a1e2"} Feb 02 10:36:10 crc kubenswrapper[4845]: I0202 10:36:10.125589 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"81e6f74786af2bcab9994da100052bfe310fb2dceb31841baae11dd472280515"} Feb 02 10:36:10 crc kubenswrapper[4845]: I0202 10:36:10.125603 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e04b720f7623a4f32efd79a47e802542478ff82153780713d07d829d51384186"} Feb 02 10:36:11 crc kubenswrapper[4845]: I0202 10:36:11.134816 4845 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:11 crc kubenswrapper[4845]: I0202 10:36:11.135081 4845 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:11 crc kubenswrapper[4845]: I0202 10:36:11.135002 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a1a2e0959b946db6c2a17d60afb9d27f63f74d2f9576d5e12807882710f7978c"} Feb 02 10:36:11 crc kubenswrapper[4845]: I0202 10:36:11.135125 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7537b27493878f8bf00411d64cbadcfe74b4d897814ea7d5a7b3c16b1ab6692c"} Feb 02 10:36:11 crc kubenswrapper[4845]: I0202 10:36:11.135149 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:13 crc kubenswrapper[4845]: I0202 10:36:13.731961 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:13 crc kubenswrapper[4845]: I0202 10:36:13.732443 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:13 crc kubenswrapper[4845]: I0202 10:36:13.741377 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:14 crc kubenswrapper[4845]: I0202 10:36:14.178464 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:36:16 crc kubenswrapper[4845]: I0202 10:36:16.154166 4845 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:16 crc kubenswrapper[4845]: I0202 10:36:16.168168 4845 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d115014f-e828-4339-b181-ebb8a9f6d3cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e04b720f7623a4f32efd79a47e802542478ff82153780713d07d829d51384186\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31650c369a34fb4b7c02e1dfba30243032457402224de23d3fb3de47ea29a1e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81e6f74786af2bcab9994da100052bfe310fb2dceb31841baae11dd472280515\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e
6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:36:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7537b27493878f8bf00411d64cbadcfe74b4d897814ea7d5a7b3c16b1ab6692c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:36:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1a2e0959b946db6c2a17d60afb9d27f63f74d2f9576d5e12807882710f7978c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:36:10Z\\\"}}}],\\\"phase\\\":\\\"Running\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": pods \"kube-apiserver-crc\" not found" Feb 02 10:36:17 crc kubenswrapper[4845]: I0202 10:36:17.178600 4845 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:17 crc kubenswrapper[4845]: I0202 10:36:17.178645 4845 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:17 crc kubenswrapper[4845]: I0202 10:36:17.187016 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:36:17 crc kubenswrapper[4845]: I0202 10:36:17.191265 4845 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="414c7d44-51dd-4046-b309-84e94419d37b" Feb 02 10:36:18 crc kubenswrapper[4845]: I0202 10:36:18.185267 4845 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:18 crc kubenswrapper[4845]: I0202 10:36:18.185316 4845 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d115014f-e828-4339-b181-ebb8a9f6d3cf" Feb 02 10:36:19 crc kubenswrapper[4845]: I0202 10:36:19.230931 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:36:19 crc kubenswrapper[4845]: I0202 10:36:19.239233 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:36:19 crc kubenswrapper[4845]: I0202 10:36:19.743028 4845 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="414c7d44-51dd-4046-b309-84e94419d37b" Feb 02 10:36:20 crc kubenswrapper[4845]: I0202 10:36:20.206187 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:36:25 crc kubenswrapper[4845]: I0202 10:36:25.611427 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 02 10:36:26 crc kubenswrapper[4845]: I0202 10:36:26.247834 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 02 10:36:26 crc kubenswrapper[4845]: I0202 10:36:26.467345 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 02 10:36:26 crc kubenswrapper[4845]: I0202 10:36:26.665067 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 02 10:36:26 crc kubenswrapper[4845]: I0202 10:36:26.792730 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 10:36:26 crc kubenswrapper[4845]: I0202 10:36:26.974274 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 02 10:36:27 crc kubenswrapper[4845]: I0202 10:36:27.053183 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 02 10:36:27 crc kubenswrapper[4845]: I0202 10:36:27.085541 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:36:27 
crc kubenswrapper[4845]: I0202 10:36:27.225097 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 02 10:36:27 crc kubenswrapper[4845]: I0202 10:36:27.348802 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 10:36:27 crc kubenswrapper[4845]: I0202 10:36:27.782448 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 02 10:36:27 crc kubenswrapper[4845]: I0202 10:36:27.948020 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 02 10:36:27 crc kubenswrapper[4845]: I0202 10:36:27.991941 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.136020 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.373614 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.382953 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.522362 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.566050 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.586300 4845 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw"
Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.825571 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.832934 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 02 10:36:28 crc kubenswrapper[4845]: I0202 10:36:28.929797 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 02 10:36:29 crc kubenswrapper[4845]: I0202 10:36:29.205766 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 02 10:36:29 crc kubenswrapper[4845]: I0202 10:36:29.393683 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 02 10:36:29 crc kubenswrapper[4845]: I0202 10:36:29.569441 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 02 10:36:29 crc kubenswrapper[4845]: I0202 10:36:29.678170 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 02 10:36:29 crc kubenswrapper[4845]: I0202 10:36:29.682657 4845 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 02 10:36:29 crc kubenswrapper[4845]: I0202 10:36:29.902472 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 02 10:36:29 crc kubenswrapper[4845]: I0202 10:36:29.922386 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 02 10:36:29 crc kubenswrapper[4845]: I0202 10:36:29.941353 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.024146 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.048863 4845 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.232403 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.307961 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.325480 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.350698 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.399042 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.468378 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.495003 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.497995 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.534361 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.606433 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.655237 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.717803 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.783871 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.845003 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 02 10:36:30 crc kubenswrapper[4845]: I0202 10:36:30.914003 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.139576 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.241323 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.273465 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.343503 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.357257 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.471446 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.673977 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.798043 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.832800 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.866963 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.875372 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.882170 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.882564 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.890304 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.984118 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 02 10:36:31 crc kubenswrapper[4845]: I0202 10:36:31.993691 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.001702 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.035184 4845 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.134924 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.136542 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.185109 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.295958 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.370411 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.416337 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.437442 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.499856 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.569129 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.654692 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.658712 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.773530 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.789359 4845 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.800466 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.800565 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.812088 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.815496 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.834683 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.834660456 podStartE2EDuration="16.834660456s" podCreationTimestamp="2026-02-02 10:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:36:32.831842364 +0000 UTC m=+273.923243824" watchObservedRunningTime="2026-02-02 10:36:32.834660456 +0000 UTC m=+273.926061916"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.944661 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 02 10:36:32 crc kubenswrapper[4845]: I0202 10:36:32.979068 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.030270 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.066130 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.086797 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.123292 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.163150 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.167956 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.172182 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.180403 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.334595 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.367315 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.381753 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.456168 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.459911 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.512344 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.605756 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.657211 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.674496 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.704586 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.741601 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.851856 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.894388 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.913250 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 02 10:36:33 crc kubenswrapper[4845]: I0202 10:36:33.955642 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.210092 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.273251 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.325880 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.419657 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.451934 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.527952 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.626177 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.668645 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.672410 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.749530 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 02 10:36:34 crc kubenswrapper[4845]: I0202 10:36:34.989666 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.002850 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.020780 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.026770 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.065500 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.070848 4845 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.144062 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.146016 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.207727 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.269616 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.384726 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.444374 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.444441 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.444564 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.479179 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.482650 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.500081 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.518098 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.565246 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.582120 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.584588 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.585073 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.602620 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.617386 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.755764 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.800203 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.870359 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 02 10:36:35 crc kubenswrapper[4845]: I0202 10:36:35.894648 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.041432 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.244461 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.335707 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.371557 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.385650 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.470933 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.533880 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.539722 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.580019 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.634728 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.640346 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.649374 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.659649 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.695092 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.713697 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.762320 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.763133 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.828429 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.909588 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 02 10:36:36 crc kubenswrapper[4845]: I0202 10:36:36.999254 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.025972 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.074076 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.085532 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.129740 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.160541 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.313851 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.320744 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.355249 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.356482 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.377305 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.386507 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.387215 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.391355 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.397147 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.410965 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.470686 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.558222 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.711934 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.722892 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.729683 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 02 10:36:37 crc kubenswrapper[4845]: I0202 10:36:37.947213 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.028270 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.059263 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.219281 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.243442 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.249148 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.267547 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.292054 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.301021 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.366158 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.500747 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.512779 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.703802 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.712621 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.737015 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.752841 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.808789 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.938099 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.976161 4845 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 02 10:36:38 crc kubenswrapper[4845]: I0202 10:36:38.976468 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446" gracePeriod=5
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.012630 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.098663 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.115396 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.146454 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.211165 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.313011 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.460613 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.674068 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.725534 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.746085 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.772081 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 02 10:36:39 crc kubenswrapper[4845]: I0202 10:36:39.803579 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.014878 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.156364 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.160229 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.269930 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.274968 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.326041 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.333795 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.340196 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.351015 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.395618 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.512268 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.529129 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.610419 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.645998 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.702434 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.715733 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.952396 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 02 10:36:40 crc kubenswrapper[4845]: I0202 10:36:40.990849 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.026450 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.059455 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.206363 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 02 10:36:41 crc kubenswrapper[4845]: I0202
10:36:41.395614 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.532861 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.585081 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.695094 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.747640 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.762572 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nqqx9"] Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.762902 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nqqx9" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="registry-server" containerID="cri-o://5769d4596805c8a147c91069eb2109528eac8714acdda17a418b761289524bf3" gracePeriod=30 Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.775894 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srnzq"] Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.776130 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-srnzq" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerName="registry-server" containerID="cri-o://68d3140ebd20281e9eddc75449b750e9027bcd03b8ff2f48c1cab9686d75572d" gracePeriod=30 Feb 02 10:36:41 crc 
kubenswrapper[4845]: I0202 10:36:41.792042 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hr44j"] Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.792323 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" podUID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" containerName="marketplace-operator" containerID="cri-o://6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6" gracePeriod=30 Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.805931 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf5wp"] Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.806205 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xf5wp" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerName="registry-server" containerID="cri-o://692c027aeff58a06f776073818a159fb8a42b2840f14b77a4b11565409dfc3c8" gracePeriod=30 Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.816310 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nb8f9"] Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.817213 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nb8f9" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerName="registry-server" containerID="cri-o://bbcd1edbd954385c8b6a4defc894e861ef2b8ac66447cad7befe95fddedb8599" gracePeriod=30 Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.855765 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ms22s"] Feb 02 10:36:41 crc kubenswrapper[4845]: E0202 10:36:41.855993 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.856005 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 10:36:41 crc kubenswrapper[4845]: E0202 10:36:41.856021 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" containerName="installer" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.856028 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" containerName="installer" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.856115 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.856129 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f3600b-d9f0-4d8e-8a5a-2b03161164b4" containerName="installer" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.856472 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.867205 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ms22s"] Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.902058 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.927798 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:36:41 crc kubenswrapper[4845]: I0202 10:36:41.935451 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.030878 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.045002 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdgdc\" (UniqueName: \"kubernetes.io/projected/9fc452cb-0731-44f6-aae8-bad730786d8a-kube-api-access-vdgdc\") pod \"marketplace-operator-79b997595-ms22s\" (UID: \"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.045094 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9fc452cb-0731-44f6-aae8-bad730786d8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ms22s\" (UID: \"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.045169 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc452cb-0731-44f6-aae8-bad730786d8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ms22s\" (UID: \"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.064710 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.078071 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.150796 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9fc452cb-0731-44f6-aae8-bad730786d8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ms22s\" (UID: \"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.151161 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc452cb-0731-44f6-aae8-bad730786d8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ms22s\" (UID: \"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.151190 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdgdc\" (UniqueName: \"kubernetes.io/projected/9fc452cb-0731-44f6-aae8-bad730786d8a-kube-api-access-vdgdc\") pod \"marketplace-operator-79b997595-ms22s\" (UID: 
\"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.153701 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9fc452cb-0731-44f6-aae8-bad730786d8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ms22s\" (UID: \"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.156393 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9fc452cb-0731-44f6-aae8-bad730786d8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ms22s\" (UID: \"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.168386 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdgdc\" (UniqueName: \"kubernetes.io/projected/9fc452cb-0731-44f6-aae8-bad730786d8a-kube-api-access-vdgdc\") pod \"marketplace-operator-79b997595-ms22s\" (UID: \"9fc452cb-0731-44f6-aae8-bad730786d8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.208479 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.235454 4845 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.287256 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.353873 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-trusted-ca\") pod \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.353968 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk7lz\" (UniqueName: \"kubernetes.io/projected/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-kube-api-access-mk7lz\") pod \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.354073 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-operator-metrics\") pod \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\" (UID: \"d2ddc114-bfc4-444f-aeb3-8d43d95bec09\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.353970 4845 generic.go:334] "Generic (PLEG): container finished" podID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerID="692c027aeff58a06f776073818a159fb8a42b2840f14b77a4b11565409dfc3c8" exitCode=0 Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.354259 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf5wp" event={"ID":"3ceca4a8-b0dd-47cc-a1fe-818e984af772","Type":"ContainerDied","Data":"692c027aeff58a06f776073818a159fb8a42b2840f14b77a4b11565409dfc3c8"} Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.354549 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d2ddc114-bfc4-444f-aeb3-8d43d95bec09" (UID: "d2ddc114-bfc4-444f-aeb3-8d43d95bec09"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.364717 4845 generic.go:334] "Generic (PLEG): container finished" podID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerID="5769d4596805c8a147c91069eb2109528eac8714acdda17a418b761289524bf3" exitCode=0 Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.365086 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqqx9" event={"ID":"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06","Type":"ContainerDied","Data":"5769d4596805c8a147c91069eb2109528eac8714acdda17a418b761289524bf3"} Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.368101 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d2ddc114-bfc4-444f-aeb3-8d43d95bec09" (UID: "d2ddc114-bfc4-444f-aeb3-8d43d95bec09"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.370539 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-kube-api-access-mk7lz" (OuterVolumeSpecName: "kube-api-access-mk7lz") pod "d2ddc114-bfc4-444f-aeb3-8d43d95bec09" (UID: "d2ddc114-bfc4-444f-aeb3-8d43d95bec09"). InnerVolumeSpecName "kube-api-access-mk7lz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.380267 4845 generic.go:334] "Generic (PLEG): container finished" podID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" containerID="6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6" exitCode=0 Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.380357 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" event={"ID":"d2ddc114-bfc4-444f-aeb3-8d43d95bec09","Type":"ContainerDied","Data":"6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6"} Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.380385 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" event={"ID":"d2ddc114-bfc4-444f-aeb3-8d43d95bec09","Type":"ContainerDied","Data":"a4725b8556aef79f1893fb492aa385b87083299f9e257b7881345d1bcb5c2f73"} Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.380400 4845 scope.go:117] "RemoveContainer" containerID="6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.380507 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hr44j" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.396277 4845 generic.go:334] "Generic (PLEG): container finished" podID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerID="68d3140ebd20281e9eddc75449b750e9027bcd03b8ff2f48c1cab9686d75572d" exitCode=0 Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.396363 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srnzq" event={"ID":"b3624e54-1097-4ab1-bfff-d7e0f721f8f0","Type":"ContainerDied","Data":"68d3140ebd20281e9eddc75449b750e9027bcd03b8ff2f48c1cab9686d75572d"} Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.405188 4845 generic.go:334] "Generic (PLEG): container finished" podID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerID="bbcd1edbd954385c8b6a4defc894e861ef2b8ac66447cad7befe95fddedb8599" exitCode=0 Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.405229 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8f9" event={"ID":"66911d31-17db-4d9e-b0c2-9cb699fc0778","Type":"ContainerDied","Data":"bbcd1edbd954385c8b6a4defc894e861ef2b8ac66447cad7befe95fddedb8599"} Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.416009 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hr44j"] Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.420632 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hr44j"] Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.420772 4845 scope.go:117] "RemoveContainer" containerID="6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6" Feb 02 10:36:42 crc kubenswrapper[4845]: E0202 10:36:42.423313 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6\": container with ID starting with 6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6 not found: ID does not exist" containerID="6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.423349 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6"} err="failed to get container status \"6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6\": rpc error: code = NotFound desc = could not find container \"6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6\": container with ID starting with 6ddec9957fc622ddd7645307da14e15fc1f9260e65077226b0be1226c3e80fb6 not found: ID does not exist" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.455833 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk7lz\" (UniqueName: \"kubernetes.io/projected/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-kube-api-access-mk7lz\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.455864 4845 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.455877 4845 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2ddc114-bfc4-444f-aeb3-8d43d95bec09-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.467398 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.481099 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.484053 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.526372 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.587178 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.639204 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670516 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-catalog-content\") pod \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670592 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8ql7\" (UniqueName: \"kubernetes.io/projected/3ceca4a8-b0dd-47cc-a1fe-818e984af772-kube-api-access-c8ql7\") pod \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670614 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-catalog-content\") pod \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670633 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb5dl\" (UniqueName: \"kubernetes.io/projected/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-kube-api-access-mb5dl\") pod \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670665 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-utilities\") pod \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\" (UID: \"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670695 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-utilities\") pod \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670710 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-utilities\") pod \"66911d31-17db-4d9e-b0c2-9cb699fc0778\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670725 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-catalog-content\") pod \"66911d31-17db-4d9e-b0c2-9cb699fc0778\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670742 
4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-catalog-content\") pod \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670777 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-utilities\") pod \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\" (UID: \"3ceca4a8-b0dd-47cc-a1fe-818e984af772\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670801 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg7pg\" (UniqueName: \"kubernetes.io/projected/66911d31-17db-4d9e-b0c2-9cb699fc0778-kube-api-access-fg7pg\") pod \"66911d31-17db-4d9e-b0c2-9cb699fc0778\" (UID: \"66911d31-17db-4d9e-b0c2-9cb699fc0778\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.670819 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55ffd\" (UniqueName: \"kubernetes.io/projected/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-kube-api-access-55ffd\") pod \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\" (UID: \"b3624e54-1097-4ab1-bfff-d7e0f721f8f0\") " Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.671378 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-utilities" (OuterVolumeSpecName: "utilities") pod "66911d31-17db-4d9e-b0c2-9cb699fc0778" (UID: "66911d31-17db-4d9e-b0c2-9cb699fc0778"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.671921 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-utilities" (OuterVolumeSpecName: "utilities") pod "b3624e54-1097-4ab1-bfff-d7e0f721f8f0" (UID: "b3624e54-1097-4ab1-bfff-d7e0f721f8f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.672027 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-utilities" (OuterVolumeSpecName: "utilities") pod "fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" (UID: "fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.672534 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-utilities" (OuterVolumeSpecName: "utilities") pod "3ceca4a8-b0dd-47cc-a1fe-818e984af772" (UID: "3ceca4a8-b0dd-47cc-a1fe-818e984af772"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.674274 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ceca4a8-b0dd-47cc-a1fe-818e984af772-kube-api-access-c8ql7" (OuterVolumeSpecName: "kube-api-access-c8ql7") pod "3ceca4a8-b0dd-47cc-a1fe-818e984af772" (UID: "3ceca4a8-b0dd-47cc-a1fe-818e984af772"). InnerVolumeSpecName "kube-api-access-c8ql7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.674309 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66911d31-17db-4d9e-b0c2-9cb699fc0778-kube-api-access-fg7pg" (OuterVolumeSpecName: "kube-api-access-fg7pg") pod "66911d31-17db-4d9e-b0c2-9cb699fc0778" (UID: "66911d31-17db-4d9e-b0c2-9cb699fc0778"). InnerVolumeSpecName "kube-api-access-fg7pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.674841 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-kube-api-access-mb5dl" (OuterVolumeSpecName: "kube-api-access-mb5dl") pod "fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" (UID: "fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06"). InnerVolumeSpecName "kube-api-access-mb5dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.688361 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-kube-api-access-55ffd" (OuterVolumeSpecName: "kube-api-access-55ffd") pod "b3624e54-1097-4ab1-bfff-d7e0f721f8f0" (UID: "b3624e54-1097-4ab1-bfff-d7e0f721f8f0"). InnerVolumeSpecName "kube-api-access-55ffd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.705015 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ceca4a8-b0dd-47cc-a1fe-818e984af772" (UID: "3ceca4a8-b0dd-47cc-a1fe-818e984af772"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.728956 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" (UID: "fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.734745 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.734920 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3624e54-1097-4ab1-bfff-d7e0f721f8f0" (UID: "b3624e54-1097-4ab1-bfff-d7e0f721f8f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.762518 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ms22s"] Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772533 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772590 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg7pg\" (UniqueName: \"kubernetes.io/projected/66911d31-17db-4d9e-b0c2-9cb699fc0778-kube-api-access-fg7pg\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772604 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55ffd\" (UniqueName: \"kubernetes.io/projected/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-kube-api-access-55ffd\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772616 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772628 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8ql7\" (UniqueName: \"kubernetes.io/projected/3ceca4a8-b0dd-47cc-a1fe-818e984af772-kube-api-access-c8ql7\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772638 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ceca4a8-b0dd-47cc-a1fe-818e984af772-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772649 4845 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-mb5dl\" (UniqueName: \"kubernetes.io/projected/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-kube-api-access-mb5dl\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772660 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772671 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772681 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.772691 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3624e54-1097-4ab1-bfff-d7e0f721f8f0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.775395 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.810873 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66911d31-17db-4d9e-b0c2-9cb699fc0778" (UID: "66911d31-17db-4d9e-b0c2-9cb699fc0778"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:36:42 crc kubenswrapper[4845]: I0202 10:36:42.873467 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66911d31-17db-4d9e-b0c2-9cb699fc0778-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.351183 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.413247 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xf5wp" event={"ID":"3ceca4a8-b0dd-47cc-a1fe-818e984af772","Type":"ContainerDied","Data":"0d84e03378117188ed8428e61cf660af9bf78e4894f24fa58ab65c169ebc8078"} Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.413305 4845 scope.go:117] "RemoveContainer" containerID="692c027aeff58a06f776073818a159fb8a42b2840f14b77a4b11565409dfc3c8" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.413608 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xf5wp" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.416462 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nqqx9" event={"ID":"fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06","Type":"ContainerDied","Data":"411d114dec4634d8215b7fca2758294946426abdcb66d052869b2ccdb984e078"} Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.416625 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nqqx9" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.420447 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-srnzq" event={"ID":"b3624e54-1097-4ab1-bfff-d7e0f721f8f0","Type":"ContainerDied","Data":"7489421b9d82561c3c59320133d54302d24308214065fb740dafa4f42a2056e8"} Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.420545 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-srnzq" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.423141 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" event={"ID":"9fc452cb-0731-44f6-aae8-bad730786d8a","Type":"ContainerStarted","Data":"880ea4fc915d7976764e84ba40b8dd836fe5217f2abeca4ef742fe07b59cbdf7"} Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.423177 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" event={"ID":"9fc452cb-0731-44f6-aae8-bad730786d8a","Type":"ContainerStarted","Data":"272368097c4ce28ad0896f9e461d21f2c134747a55110be66da3698f46d15052"} Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.423349 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.426071 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.426355 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nb8f9" event={"ID":"66911d31-17db-4d9e-b0c2-9cb699fc0778","Type":"ContainerDied","Data":"16c241a98cc49754e7cd69effe7e44d81157009a11eabff0c36f137b39003b4c"} Feb 02 
10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.426453 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nb8f9" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.429744 4845 scope.go:117] "RemoveContainer" containerID="9de0fd1bbbe5a9cccba7aa175bf6868c3db78e2836e4127189cb452f9838b8c4" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.438066 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ms22s" podStartSLOduration=2.437978281 podStartE2EDuration="2.437978281s" podCreationTimestamp="2026-02-02 10:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:36:43.437652972 +0000 UTC m=+284.529054462" watchObservedRunningTime="2026-02-02 10:36:43.437978281 +0000 UTC m=+284.529379731" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.466586 4845 scope.go:117] "RemoveContainer" containerID="df3414b89201cc98711df2db8d1c899c06cc968a324e0b8467c7e36e96868e51" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.484671 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-srnzq"] Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.492011 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-srnzq"] Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.497843 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nqqx9"] Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.504317 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nqqx9"] Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.507506 4845 scope.go:117] "RemoveContainer" 
containerID="5769d4596805c8a147c91069eb2109528eac8714acdda17a418b761289524bf3" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.511408 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf5wp"] Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.517111 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xf5wp"] Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.535116 4845 scope.go:117] "RemoveContainer" containerID="b1fd42c06508a5f7cc77e759a154a4b98880587d65c3430e3b9c376f3441f3e5" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.541846 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nb8f9"] Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.546214 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nb8f9"] Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.552359 4845 scope.go:117] "RemoveContainer" containerID="9aa6e5661ba4da1e3f07e47952d04aef1b22875db8155df2a2c40d59b58e9f5c" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.569034 4845 scope.go:117] "RemoveContainer" containerID="68d3140ebd20281e9eddc75449b750e9027bcd03b8ff2f48c1cab9686d75572d" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.599175 4845 scope.go:117] "RemoveContainer" containerID="5a24fa9c118523cf61e0979a46080c1c45d6df89bfac951d244d917947874a3c" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.621479 4845 scope.go:117] "RemoveContainer" containerID="bce7fc681aa538cb76a68f4632435369a39f67dafd5cfa64c4b02c6ca032214c" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.633750 4845 scope.go:117] "RemoveContainer" containerID="bbcd1edbd954385c8b6a4defc894e861ef2b8ac66447cad7befe95fddedb8599" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.647606 4845 scope.go:117] "RemoveContainer" 
containerID="892e8dde5cc61beb8bab363fa98a836495168d8b933253d6114935636c04fbe1" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.659734 4845 scope.go:117] "RemoveContainer" containerID="ce02065ee7ea69eff70e052c4a44aab42be4dcba45489f2ca7fc1dcf482f400b" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.719045 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" path="/var/lib/kubelet/pods/3ceca4a8-b0dd-47cc-a1fe-818e984af772/volumes" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.720272 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" path="/var/lib/kubelet/pods/66911d31-17db-4d9e-b0c2-9cb699fc0778/volumes" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.721380 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" path="/var/lib/kubelet/pods/b3624e54-1097-4ab1-bfff-d7e0f721f8f0/volumes" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.723295 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" path="/var/lib/kubelet/pods/d2ddc114-bfc4-444f-aeb3-8d43d95bec09/volumes" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.724205 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" path="/var/lib/kubelet/pods/fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06/volumes" Feb 02 10:36:43 crc kubenswrapper[4845]: I0202 10:36:43.812935 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.066535 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.097006 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.097066 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189137 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189206 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189245 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189314 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189371 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189390 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189396 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189451 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189783 4845 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189796 4845 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189806 4845 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.189830 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.197540 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.291259 4845 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.291670 4845 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.450212 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.450656 4845 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446" exitCode=137 Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.450764 4845 scope.go:117] "RemoveContainer" containerID="52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.450938 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.500741 4845 scope.go:117] "RemoveContainer" containerID="52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446" Feb 02 10:36:44 crc kubenswrapper[4845]: E0202 10:36:44.504258 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446\": container with ID starting with 52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446 not found: ID does not exist" containerID="52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446" Feb 02 10:36:44 crc kubenswrapper[4845]: I0202 10:36:44.504476 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446"} err="failed to get container status \"52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446\": rpc error: code = NotFound desc = could not find container \"52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446\": container with ID starting with 52339319d77de91a7e9fe862ac17080ccdec88bb416fc929a78525d869b63446 not found: ID does not exist" Feb 02 10:36:45 crc kubenswrapper[4845]: I0202 10:36:45.722189 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 02 10:36:59 crc kubenswrapper[4845]: I0202 10:36:59.462125 4845 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.542250 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj"] Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 
10:37:13.542972 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.542986 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543002 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerName="extract-content" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543009 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerName="extract-content" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543019 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543026 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543038 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" containerName="marketplace-operator" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543044 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" containerName="marketplace-operator" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543053 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerName="extract-utilities" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543058 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerName="extract-utilities" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 
10:37:13.543067 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerName="extract-utilities" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543074 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerName="extract-utilities" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543082 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="extract-content" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543089 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="extract-content" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543097 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerName="extract-content" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543104 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerName="extract-content" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543112 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543118 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543129 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerName="extract-utilities" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543161 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerName="extract-utilities" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 
10:37:13.543173 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="extract-utilities" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543180 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="extract-utilities" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543190 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerName="extract-content" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543197 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerName="extract-content" Feb 02 10:37:13 crc kubenswrapper[4845]: E0202 10:37:13.543207 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543215 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543311 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ceca4a8-b0dd-47cc-a1fe-818e984af772" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543324 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ddc114-bfc4-444f-aeb3-8d43d95bec09" containerName="marketplace-operator" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543330 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3624e54-1097-4ab1-bfff-d7e0f721f8f0" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543341 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd2ac18-0c6c-41fe-a9e8-423d6a94ce06" containerName="registry-server" Feb 02 10:37:13 crc 
kubenswrapper[4845]: I0202 10:37:13.543348 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="66911d31-17db-4d9e-b0c2-9cb699fc0778" containerName="registry-server" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.543718 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.546872 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.547655 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.550346 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.552265 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.552327 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.557778 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj"] Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.656556 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4cc1851-2948-4983-81e0-70137f12c223-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 
crc kubenswrapper[4845]: I0202 10:37:13.656751 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f4cc1851-2948-4983-81e0-70137f12c223-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.656830 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcr7h\" (UniqueName: \"kubernetes.io/projected/f4cc1851-2948-4983-81e0-70137f12c223-kube-api-access-vcr7h\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.758738 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f4cc1851-2948-4983-81e0-70137f12c223-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.758826 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcr7h\" (UniqueName: \"kubernetes.io/projected/f4cc1851-2948-4983-81e0-70137f12c223-kube-api-access-vcr7h\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.758865 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f4cc1851-2948-4983-81e0-70137f12c223-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.760857 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f4cc1851-2948-4983-81e0-70137f12c223-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.775130 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f4cc1851-2948-4983-81e0-70137f12c223-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.777626 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcr7h\" (UniqueName: \"kubernetes.io/projected/f4cc1851-2948-4983-81e0-70137f12c223-kube-api-access-vcr7h\") pod \"cluster-monitoring-operator-6d5b84845-bzxrj\" (UID: \"f4cc1851-2948-4983-81e0-70137f12c223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:13 crc kubenswrapper[4845]: I0202 10:37:13.862154 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" Feb 02 10:37:14 crc kubenswrapper[4845]: I0202 10:37:14.318740 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj"] Feb 02 10:37:14 crc kubenswrapper[4845]: I0202 10:37:14.631562 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" event={"ID":"f4cc1851-2948-4983-81e0-70137f12c223","Type":"ContainerStarted","Data":"011698660f79c3f5901b8e6ea7cd5d8421033a50faaf212b382e44364a883db4"} Feb 02 10:37:17 crc kubenswrapper[4845]: I0202 10:37:17.652352 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" event={"ID":"f4cc1851-2948-4983-81e0-70137f12c223","Type":"ContainerStarted","Data":"f0e3afe6cb5d06975889cb71bb7fc76d5f9f7e46e9ee6c91e9065fe146ed995a"} Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.065959 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-bzxrj" podStartSLOduration=1.8998176180000002 podStartE2EDuration="5.065942749s" podCreationTimestamp="2026-02-02 10:37:13 +0000 UTC" firstStartedPulling="2026-02-02 10:37:14.328062483 +0000 UTC m=+315.419463943" lastFinishedPulling="2026-02-02 10:37:17.494187624 +0000 UTC m=+318.585589074" observedRunningTime="2026-02-02 10:37:17.676753481 +0000 UTC m=+318.768154931" watchObservedRunningTime="2026-02-02 10:37:18.065942749 +0000 UTC m=+319.157344199" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.068026 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl"] Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.068693 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.070297 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.077540 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl"] Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.098577 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-txvzb"] Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.099615 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.114919 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34701544-b4ad-4596-90fe-5783d7decc81-ca-trust-extracted\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.115000 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.115049 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7fx\" (UniqueName: 
\"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-kube-api-access-hg7fx\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.115074 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8zgnl\" (UID: \"62e18897-4517-49ff-8a99-6a4447fa6a1e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.115133 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34701544-b4ad-4596-90fe-5783d7decc81-installation-pull-secrets\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.115149 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-bound-sa-token\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.115170 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34701544-b4ad-4596-90fe-5783d7decc81-registry-certificates\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.115191 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-registry-tls\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.115211 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34701544-b4ad-4596-90fe-5783d7decc81-trusted-ca\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.118549 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-txvzb"] Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.140527 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.216524 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8zgnl\" (UID: \"62e18897-4517-49ff-8a99-6a4447fa6a1e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 
10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.216602 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34701544-b4ad-4596-90fe-5783d7decc81-installation-pull-secrets\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.216622 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-bound-sa-token\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: E0202 10:37:18.216639 4845 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:18 crc kubenswrapper[4845]: E0202 10:37:18.216699 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates podName:62e18897-4517-49ff-8a99-6a4447fa6a1e nodeName:}" failed. No retries permitted until 2026-02-02 10:37:18.716683288 +0000 UTC m=+319.808084738 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-8zgnl" (UID: "62e18897-4517-49ff-8a99-6a4447fa6a1e") : secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.216646 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34701544-b4ad-4596-90fe-5783d7decc81-registry-certificates\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.216761 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-registry-tls\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.216785 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34701544-b4ad-4596-90fe-5783d7decc81-trusted-ca\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.216810 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34701544-b4ad-4596-90fe-5783d7decc81-ca-trust-extracted\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc 
kubenswrapper[4845]: I0202 10:37:18.216843 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7fx\" (UniqueName: \"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-kube-api-access-hg7fx\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.217453 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/34701544-b4ad-4596-90fe-5783d7decc81-ca-trust-extracted\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.217986 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/34701544-b4ad-4596-90fe-5783d7decc81-registry-certificates\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.218124 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34701544-b4ad-4596-90fe-5783d7decc81-trusted-ca\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.222573 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/34701544-b4ad-4596-90fe-5783d7decc81-installation-pull-secrets\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.223773 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-registry-tls\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.234490 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-bound-sa-token\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.240634 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7fx\" (UniqueName: \"kubernetes.io/projected/34701544-b4ad-4596-90fe-5783d7decc81-kube-api-access-hg7fx\") pod \"image-registry-66df7c8f76-txvzb\" (UID: \"34701544-b4ad-4596-90fe-5783d7decc81\") " pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.419685 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.722690 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8zgnl\" (UID: \"62e18897-4517-49ff-8a99-6a4447fa6a1e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:18 crc kubenswrapper[4845]: E0202 10:37:18.722938 4845 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:18 crc kubenswrapper[4845]: E0202 10:37:18.723204 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates podName:62e18897-4517-49ff-8a99-6a4447fa6a1e nodeName:}" failed. No retries permitted until 2026-02-02 10:37:19.72317779 +0000 UTC m=+320.814579320 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-8zgnl" (UID: "62e18897-4517-49ff-8a99-6a4447fa6a1e") : secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:18 crc kubenswrapper[4845]: I0202 10:37:18.829255 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-txvzb"] Feb 02 10:37:18 crc kubenswrapper[4845]: W0202 10:37:18.836157 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34701544_b4ad_4596_90fe_5783d7decc81.slice/crio-4eb83a72b26f208c46d5c1090a36d4c461bcce89c009d27668f4c619abd1c85a WatchSource:0}: Error finding container 4eb83a72b26f208c46d5c1090a36d4c461bcce89c009d27668f4c619abd1c85a: Status 404 returned error can't find the container with id 4eb83a72b26f208c46d5c1090a36d4c461bcce89c009d27668f4c619abd1c85a Feb 02 10:37:19 crc kubenswrapper[4845]: I0202 10:37:19.665077 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" event={"ID":"34701544-b4ad-4596-90fe-5783d7decc81","Type":"ContainerStarted","Data":"47e48c94b9474400eb3edd6ec0ae59077835e53f43b12de7e77cec8911252e65"} Feb 02 10:37:19 crc kubenswrapper[4845]: I0202 10:37:19.665398 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" event={"ID":"34701544-b4ad-4596-90fe-5783d7decc81","Type":"ContainerStarted","Data":"4eb83a72b26f208c46d5c1090a36d4c461bcce89c009d27668f4c619abd1c85a"} Feb 02 10:37:19 crc kubenswrapper[4845]: I0202 10:37:19.665442 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:19 crc kubenswrapper[4845]: I0202 10:37:19.734604 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8zgnl\" (UID: \"62e18897-4517-49ff-8a99-6a4447fa6a1e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:19 crc kubenswrapper[4845]: E0202 10:37:19.734807 4845 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:19 crc kubenswrapper[4845]: E0202 10:37:19.734860 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates podName:62e18897-4517-49ff-8a99-6a4447fa6a1e nodeName:}" failed. No retries permitted until 2026-02-02 10:37:21.734845054 +0000 UTC m=+322.826246504 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-8zgnl" (UID: "62e18897-4517-49ff-8a99-6a4447fa6a1e") : secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:21 crc kubenswrapper[4845]: I0202 10:37:21.762914 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8zgnl\" (UID: \"62e18897-4517-49ff-8a99-6a4447fa6a1e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:21 crc kubenswrapper[4845]: E0202 10:37:21.763095 4845 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:21 crc 
kubenswrapper[4845]: E0202 10:37:21.763175 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates podName:62e18897-4517-49ff-8a99-6a4447fa6a1e nodeName:}" failed. No retries permitted until 2026-02-02 10:37:25.763155928 +0000 UTC m=+326.854557378 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-8zgnl" (UID: "62e18897-4517-49ff-8a99-6a4447fa6a1e") : secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.590289 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" podStartSLOduration=6.5902637649999996 podStartE2EDuration="6.590263765s" podCreationTimestamp="2026-02-02 10:37:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:37:19.684780843 +0000 UTC m=+320.776182353" watchObservedRunningTime="2026-02-02 10:37:24.590263765 +0000 UTC m=+325.681665245" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.595501 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-575p8"] Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.597426 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.602232 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.602933 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-575p8"] Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.701488 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncbcp\" (UniqueName: \"kubernetes.io/projected/5c039981-931c-408f-8185-4d22b3da04a3-kube-api-access-ncbcp\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.701878 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c039981-931c-408f-8185-4d22b3da04a3-utilities\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.701940 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c039981-931c-408f-8185-4d22b3da04a3-catalog-content\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.803515 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c039981-931c-408f-8185-4d22b3da04a3-utilities\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " 
pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.803582 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c039981-931c-408f-8185-4d22b3da04a3-catalog-content\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.803626 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncbcp\" (UniqueName: \"kubernetes.io/projected/5c039981-931c-408f-8185-4d22b3da04a3-kube-api-access-ncbcp\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.804100 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c039981-931c-408f-8185-4d22b3da04a3-utilities\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.804237 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c039981-931c-408f-8185-4d22b3da04a3-catalog-content\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.825862 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncbcp\" (UniqueName: \"kubernetes.io/projected/5c039981-931c-408f-8185-4d22b3da04a3-kube-api-access-ncbcp\") pod \"redhat-operators-575p8\" (UID: \"5c039981-931c-408f-8185-4d22b3da04a3\") " pod="openshift-marketplace/redhat-operators-575p8" Feb 
02 10:37:24 crc kubenswrapper[4845]: I0202 10:37:24.926251 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.374627 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-575p8"] Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.403029 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k66k5"] Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.412109 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.417657 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.426250 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k66k5"] Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.513718 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26334878-6884-4481-b360-96927a5dd3d6-catalog-content\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.513767 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26334878-6884-4481-b360-96927a5dd3d6-utilities\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.513801 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfwhp\" (UniqueName: \"kubernetes.io/projected/26334878-6884-4481-b360-96927a5dd3d6-kube-api-access-jfwhp\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.616042 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26334878-6884-4481-b360-96927a5dd3d6-utilities\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.616159 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfwhp\" (UniqueName: \"kubernetes.io/projected/26334878-6884-4481-b360-96927a5dd3d6-kube-api-access-jfwhp\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.616324 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26334878-6884-4481-b360-96927a5dd3d6-catalog-content\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.616990 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26334878-6884-4481-b360-96927a5dd3d6-utilities\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.617154 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26334878-6884-4481-b360-96927a5dd3d6-catalog-content\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.636868 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfwhp\" (UniqueName: \"kubernetes.io/projected/26334878-6884-4481-b360-96927a5dd3d6-kube-api-access-jfwhp\") pod \"community-operators-k66k5\" (UID: \"26334878-6884-4481-b360-96927a5dd3d6\") " pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.707210 4845 generic.go:334] "Generic (PLEG): container finished" podID="5c039981-931c-408f-8185-4d22b3da04a3" containerID="4185cd9ca57ad1f640fcdd5b51bb9ed68a35a9f3bf1fc83ce8866beaa01a05a7" exitCode=0 Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.707332 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-575p8" event={"ID":"5c039981-931c-408f-8185-4d22b3da04a3","Type":"ContainerDied","Data":"4185cd9ca57ad1f640fcdd5b51bb9ed68a35a9f3bf1fc83ce8866beaa01a05a7"} Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.707430 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-575p8" event={"ID":"5c039981-931c-408f-8185-4d22b3da04a3","Type":"ContainerStarted","Data":"5e07e7e29ad116d41cb4e352652bcd9c0b766e5b86b1f30f73fdd657f741c84a"} Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.773949 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:25 crc kubenswrapper[4845]: I0202 10:37:25.818778 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8zgnl\" (UID: \"62e18897-4517-49ff-8a99-6a4447fa6a1e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:25 crc kubenswrapper[4845]: E0202 10:37:25.818971 4845 secret.go:188] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:25 crc kubenswrapper[4845]: E0202 10:37:25.819064 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates podName:62e18897-4517-49ff-8a99-6a4447fa6a1e nodeName:}" failed. No retries permitted until 2026-02-02 10:37:33.819042394 +0000 UTC m=+334.910443844 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates") pod "prometheus-operator-admission-webhook-f54c54754-8zgnl" (UID: "62e18897-4517-49ff-8a99-6a4447fa6a1e") : secret "prometheus-operator-admission-webhook-tls" not found Feb 02 10:37:26 crc kubenswrapper[4845]: I0202 10:37:26.213121 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k66k5"] Feb 02 10:37:26 crc kubenswrapper[4845]: I0202 10:37:26.717679 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-575p8" event={"ID":"5c039981-931c-408f-8185-4d22b3da04a3","Type":"ContainerStarted","Data":"8aa89169d748fb501d984356eca507b6756bac12b478bb616287504751e15981"} Feb 02 10:37:26 crc kubenswrapper[4845]: I0202 10:37:26.720317 4845 generic.go:334] "Generic (PLEG): container finished" podID="26334878-6884-4481-b360-96927a5dd3d6" containerID="2eaf2b49b3f03a83d519ea62d5a2cc7e2077959126a0db897a3e0db2685de33b" exitCode=0 Feb 02 10:37:26 crc kubenswrapper[4845]: I0202 10:37:26.720367 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k66k5" event={"ID":"26334878-6884-4481-b360-96927a5dd3d6","Type":"ContainerDied","Data":"2eaf2b49b3f03a83d519ea62d5a2cc7e2077959126a0db897a3e0db2685de33b"} Feb 02 10:37:26 crc kubenswrapper[4845]: I0202 10:37:26.720401 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k66k5" event={"ID":"26334878-6884-4481-b360-96927a5dd3d6","Type":"ContainerStarted","Data":"cd91a19eeb5cd5a216d3ce05005a97dfd2017523c1751040ed87a9014eb73041"} Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.186084 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-skmdg"] Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.187166 4845 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.190594 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.200204 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-skmdg"] Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.338054 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjz5v\" (UniqueName: \"kubernetes.io/projected/7b143223-c383-4b6f-b221-c8908e9f93d9-kube-api-access-pjz5v\") pod \"certified-operators-skmdg\" (UID: \"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.338112 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b143223-c383-4b6f-b221-c8908e9f93d9-catalog-content\") pod \"certified-operators-skmdg\" (UID: \"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.338217 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b143223-c383-4b6f-b221-c8908e9f93d9-utilities\") pod \"certified-operators-skmdg\" (UID: \"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.439760 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b143223-c383-4b6f-b221-c8908e9f93d9-utilities\") pod \"certified-operators-skmdg\" (UID: 
\"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.439846 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b143223-c383-4b6f-b221-c8908e9f93d9-catalog-content\") pod \"certified-operators-skmdg\" (UID: \"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.439880 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjz5v\" (UniqueName: \"kubernetes.io/projected/7b143223-c383-4b6f-b221-c8908e9f93d9-kube-api-access-pjz5v\") pod \"certified-operators-skmdg\" (UID: \"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.440387 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b143223-c383-4b6f-b221-c8908e9f93d9-catalog-content\") pod \"certified-operators-skmdg\" (UID: \"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.440412 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b143223-c383-4b6f-b221-c8908e9f93d9-utilities\") pod \"certified-operators-skmdg\" (UID: \"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.460835 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjz5v\" (UniqueName: \"kubernetes.io/projected/7b143223-c383-4b6f-b221-c8908e9f93d9-kube-api-access-pjz5v\") pod \"certified-operators-skmdg\" (UID: 
\"7b143223-c383-4b6f-b221-c8908e9f93d9\") " pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.506159 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.727110 4845 generic.go:334] "Generic (PLEG): container finished" podID="5c039981-931c-408f-8185-4d22b3da04a3" containerID="8aa89169d748fb501d984356eca507b6756bac12b478bb616287504751e15981" exitCode=0 Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.727203 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-575p8" event={"ID":"5c039981-931c-408f-8185-4d22b3da04a3","Type":"ContainerDied","Data":"8aa89169d748fb501d984356eca507b6756bac12b478bb616287504751e15981"} Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.730907 4845 generic.go:334] "Generic (PLEG): container finished" podID="26334878-6884-4481-b360-96927a5dd3d6" containerID="289c9bb9ef02865f3a495638881c70ae558363e91b3e5697700d1a08ff36e684" exitCode=0 Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.730951 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k66k5" event={"ID":"26334878-6884-4481-b360-96927a5dd3d6","Type":"ContainerDied","Data":"289c9bb9ef02865f3a495638881c70ae558363e91b3e5697700d1a08ff36e684"} Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.787414 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5vfjf"] Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.791052 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.797071 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.805489 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vfjf"] Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.910705 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-skmdg"] Feb 02 10:37:27 crc kubenswrapper[4845]: W0202 10:37:27.914519 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b143223_c383_4b6f_b221_c8908e9f93d9.slice/crio-20f386c92450c8ff41800329aa63cf123dc0d0d90cfaca1f1fff7a726f5e8cc8 WatchSource:0}: Error finding container 20f386c92450c8ff41800329aa63cf123dc0d0d90cfaca1f1fff7a726f5e8cc8: Status 404 returned error can't find the container with id 20f386c92450c8ff41800329aa63cf123dc0d0d90cfaca1f1fff7a726f5e8cc8 Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.947353 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrr2n\" (UniqueName: \"kubernetes.io/projected/ac22736d-1901-40bf-a17f-186de03c64bf-kube-api-access-qrr2n\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:27 crc kubenswrapper[4845]: I0202 10:37:27.947402 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac22736d-1901-40bf-a17f-186de03c64bf-utilities\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:27 
crc kubenswrapper[4845]: I0202 10:37:27.947422 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac22736d-1901-40bf-a17f-186de03c64bf-catalog-content\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.048644 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrr2n\" (UniqueName: \"kubernetes.io/projected/ac22736d-1901-40bf-a17f-186de03c64bf-kube-api-access-qrr2n\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.048794 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac22736d-1901-40bf-a17f-186de03c64bf-utilities\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.048903 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac22736d-1901-40bf-a17f-186de03c64bf-catalog-content\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.049324 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac22736d-1901-40bf-a17f-186de03c64bf-utilities\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:28 crc kubenswrapper[4845]: 
I0202 10:37:28.049384 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac22736d-1901-40bf-a17f-186de03c64bf-catalog-content\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.073776 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrr2n\" (UniqueName: \"kubernetes.io/projected/ac22736d-1901-40bf-a17f-186de03c64bf-kube-api-access-qrr2n\") pod \"redhat-marketplace-5vfjf\" (UID: \"ac22736d-1901-40bf-a17f-186de03c64bf\") " pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.114691 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.560680 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5vfjf"] Feb 02 10:37:28 crc kubenswrapper[4845]: W0202 10:37:28.567870 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac22736d_1901_40bf_a17f_186de03c64bf.slice/crio-93d578b2ad3caebaf3deb336250feb91f78935baf82fc1408035f753030f0971 WatchSource:0}: Error finding container 93d578b2ad3caebaf3deb336250feb91f78935baf82fc1408035f753030f0971: Status 404 returned error can't find the container with id 93d578b2ad3caebaf3deb336250feb91f78935baf82fc1408035f753030f0971 Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.740453 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-575p8" event={"ID":"5c039981-931c-408f-8185-4d22b3da04a3","Type":"ContainerStarted","Data":"4f307f275681b3e859122dad338abf9b818cae545e0852b05b8ff57d6e3aec99"} Feb 02 10:37:28 crc 
kubenswrapper[4845]: I0202 10:37:28.744548 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k66k5" event={"ID":"26334878-6884-4481-b360-96927a5dd3d6","Type":"ContainerStarted","Data":"e19db76e594f3dc096438159a0257cb5b78fd1abe6d7052f0c8beb107062c261"} Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.749552 4845 generic.go:334] "Generic (PLEG): container finished" podID="ac22736d-1901-40bf-a17f-186de03c64bf" containerID="b2bf91b0e8337aec57ec6e01c94edae195c113fb1c77c1805ac5eef6431fde43" exitCode=0 Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.749615 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vfjf" event={"ID":"ac22736d-1901-40bf-a17f-186de03c64bf","Type":"ContainerDied","Data":"b2bf91b0e8337aec57ec6e01c94edae195c113fb1c77c1805ac5eef6431fde43"} Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.749673 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vfjf" event={"ID":"ac22736d-1901-40bf-a17f-186de03c64bf","Type":"ContainerStarted","Data":"93d578b2ad3caebaf3deb336250feb91f78935baf82fc1408035f753030f0971"} Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.752522 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b143223-c383-4b6f-b221-c8908e9f93d9" containerID="c7851ad137271694bd602acbb342eb3239e9008be47f90ac9dd7087bde61e39f" exitCode=0 Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.752558 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skmdg" event={"ID":"7b143223-c383-4b6f-b221-c8908e9f93d9","Type":"ContainerDied","Data":"c7851ad137271694bd602acbb342eb3239e9008be47f90ac9dd7087bde61e39f"} Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.752580 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skmdg" 
event={"ID":"7b143223-c383-4b6f-b221-c8908e9f93d9","Type":"ContainerStarted","Data":"20f386c92450c8ff41800329aa63cf123dc0d0d90cfaca1f1fff7a726f5e8cc8"} Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.761964 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-575p8" podStartSLOduration=2.145685209 podStartE2EDuration="4.761941943s" podCreationTimestamp="2026-02-02 10:37:24 +0000 UTC" firstStartedPulling="2026-02-02 10:37:25.709163148 +0000 UTC m=+326.800564588" lastFinishedPulling="2026-02-02 10:37:28.325419872 +0000 UTC m=+329.416821322" observedRunningTime="2026-02-02 10:37:28.761380716 +0000 UTC m=+329.852782186" watchObservedRunningTime="2026-02-02 10:37:28.761941943 +0000 UTC m=+329.853343393" Feb 02 10:37:28 crc kubenswrapper[4845]: I0202 10:37:28.801474 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k66k5" podStartSLOduration=2.135580702 podStartE2EDuration="3.801451433s" podCreationTimestamp="2026-02-02 10:37:25 +0000 UTC" firstStartedPulling="2026-02-02 10:37:26.721997608 +0000 UTC m=+327.813399068" lastFinishedPulling="2026-02-02 10:37:28.387868349 +0000 UTC m=+329.479269799" observedRunningTime="2026-02-02 10:37:28.783077835 +0000 UTC m=+329.874479285" watchObservedRunningTime="2026-02-02 10:37:28.801451433 +0000 UTC m=+329.892852883" Feb 02 10:37:29 crc kubenswrapper[4845]: I0202 10:37:29.761033 4845 generic.go:334] "Generic (PLEG): container finished" podID="ac22736d-1901-40bf-a17f-186de03c64bf" containerID="916c950dcfc4493f22b562df8546551e4d1d58c44d349f20d93bba22d20f1596" exitCode=0 Feb 02 10:37:29 crc kubenswrapper[4845]: I0202 10:37:29.761098 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vfjf" event={"ID":"ac22736d-1901-40bf-a17f-186de03c64bf","Type":"ContainerDied","Data":"916c950dcfc4493f22b562df8546551e4d1d58c44d349f20d93bba22d20f1596"} Feb 02 
10:37:29 crc kubenswrapper[4845]: I0202 10:37:29.763126 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b143223-c383-4b6f-b221-c8908e9f93d9" containerID="b8b9f5c5d233746e1f26cc29696184f49f1715735404c137e2048c38b1618bd0" exitCode=0 Feb 02 10:37:29 crc kubenswrapper[4845]: I0202 10:37:29.763160 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skmdg" event={"ID":"7b143223-c383-4b6f-b221-c8908e9f93d9","Type":"ContainerDied","Data":"b8b9f5c5d233746e1f26cc29696184f49f1715735404c137e2048c38b1618bd0"} Feb 02 10:37:30 crc kubenswrapper[4845]: I0202 10:37:30.770449 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5vfjf" event={"ID":"ac22736d-1901-40bf-a17f-186de03c64bf","Type":"ContainerStarted","Data":"116dbf112e1df83646e0fd988db1e4f727ad8eb134aca7ae54b19ef1ade22ea5"} Feb 02 10:37:30 crc kubenswrapper[4845]: I0202 10:37:30.772677 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-skmdg" event={"ID":"7b143223-c383-4b6f-b221-c8908e9f93d9","Type":"ContainerStarted","Data":"1954d970a8ab3f295fb6eee4359d67d483db0625964b75e3c66b4b76a64fe467"} Feb 02 10:37:30 crc kubenswrapper[4845]: I0202 10:37:30.793035 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5vfjf" podStartSLOduration=2.34811045 podStartE2EDuration="3.793015318s" podCreationTimestamp="2026-02-02 10:37:27 +0000 UTC" firstStartedPulling="2026-02-02 10:37:28.751230477 +0000 UTC m=+329.842631927" lastFinishedPulling="2026-02-02 10:37:30.196135345 +0000 UTC m=+331.287536795" observedRunningTime="2026-02-02 10:37:30.78945493 +0000 UTC m=+331.880856380" watchObservedRunningTime="2026-02-02 10:37:30.793015318 +0000 UTC m=+331.884416768" Feb 02 10:37:30 crc kubenswrapper[4845]: I0202 10:37:30.805324 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-skmdg" podStartSLOduration=2.246818254 podStartE2EDuration="3.805301242s" podCreationTimestamp="2026-02-02 10:37:27 +0000 UTC" firstStartedPulling="2026-02-02 10:37:28.755474496 +0000 UTC m=+329.846875946" lastFinishedPulling="2026-02-02 10:37:30.313957484 +0000 UTC m=+331.405358934" observedRunningTime="2026-02-02 10:37:30.804797516 +0000 UTC m=+331.896198966" watchObservedRunningTime="2026-02-02 10:37:30.805301242 +0000 UTC m=+331.896702692" Feb 02 10:37:33 crc kubenswrapper[4845]: I0202 10:37:33.825340 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8zgnl\" (UID: \"62e18897-4517-49ff-8a99-6a4447fa6a1e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:33 crc kubenswrapper[4845]: I0202 10:37:33.839639 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/62e18897-4517-49ff-8a99-6a4447fa6a1e-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8zgnl\" (UID: \"62e18897-4517-49ff-8a99-6a4447fa6a1e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:33 crc kubenswrapper[4845]: I0202 10:37:33.984699 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:34 crc kubenswrapper[4845]: I0202 10:37:34.481611 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl"] Feb 02 10:37:34 crc kubenswrapper[4845]: I0202 10:37:34.795856 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" event={"ID":"62e18897-4517-49ff-8a99-6a4447fa6a1e","Type":"ContainerStarted","Data":"cf414cc8499360af2ec8668da8406791c87e4f3dfba5bf93e4ad21e80f2743ff"} Feb 02 10:37:34 crc kubenswrapper[4845]: I0202 10:37:34.927308 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:34 crc kubenswrapper[4845]: I0202 10:37:34.927411 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:34 crc kubenswrapper[4845]: I0202 10:37:34.975750 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:35 crc kubenswrapper[4845]: I0202 10:37:35.775249 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:35 crc kubenswrapper[4845]: I0202 10:37:35.775932 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:35 crc kubenswrapper[4845]: I0202 10:37:35.828623 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:35 crc kubenswrapper[4845]: I0202 10:37:35.855303 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-575p8" Feb 02 10:37:35 crc 
kubenswrapper[4845]: I0202 10:37:35.907616 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k66k5" Feb 02 10:37:37 crc kubenswrapper[4845]: I0202 10:37:37.507052 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:37 crc kubenswrapper[4845]: I0202 10:37:37.507403 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:37 crc kubenswrapper[4845]: I0202 10:37:37.550401 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:37 crc kubenswrapper[4845]: I0202 10:37:37.821768 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" event={"ID":"62e18897-4517-49ff-8a99-6a4447fa6a1e","Type":"ContainerStarted","Data":"710d9d8207825262c0d5ee981005df7e99702ffd23e5db7ee40bc8d3b0a06dbf"} Feb 02 10:37:37 crc kubenswrapper[4845]: I0202 10:37:37.844975 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" podStartSLOduration=17.679334727 podStartE2EDuration="19.844953399s" podCreationTimestamp="2026-02-02 10:37:18 +0000 UTC" firstStartedPulling="2026-02-02 10:37:34.497201443 +0000 UTC m=+335.588602933" lastFinishedPulling="2026-02-02 10:37:36.662820145 +0000 UTC m=+337.754221605" observedRunningTime="2026-02-02 10:37:37.839665609 +0000 UTC m=+338.931067069" watchObservedRunningTime="2026-02-02 10:37:37.844953399 +0000 UTC m=+338.936354859" Feb 02 10:37:37 crc kubenswrapper[4845]: I0202 10:37:37.865959 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-skmdg" Feb 02 10:37:38 crc kubenswrapper[4845]: I0202 
10:37:38.114875 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:38 crc kubenswrapper[4845]: I0202 10:37:38.114978 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:38 crc kubenswrapper[4845]: I0202 10:37:38.164155 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:38 crc kubenswrapper[4845]: I0202 10:37:38.426607 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-txvzb" Feb 02 10:37:38 crc kubenswrapper[4845]: I0202 10:37:38.484270 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-thf72"] Feb 02 10:37:38 crc kubenswrapper[4845]: I0202 10:37:38.828552 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:38 crc kubenswrapper[4845]: I0202 10:37:38.835425 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8zgnl" Feb 02 10:37:38 crc kubenswrapper[4845]: I0202 10:37:38.897739 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5vfjf" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.115729 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-f6hrw"] Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.117442 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.120841 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.120936 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-ww5h7" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.121123 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.121355 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.125418 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-f6hrw"] Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.225466 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b56dea6f-784c-4bd6-b1fe-e34acac80980-metrics-client-ca\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.226051 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b56dea6f-784c-4bd6-b1fe-e34acac80980-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 
10:37:39.226113 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b56dea6f-784c-4bd6-b1fe-e34acac80980-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.226137 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmk6\" (UniqueName: \"kubernetes.io/projected/b56dea6f-784c-4bd6-b1fe-e34acac80980-kube-api-access-9wmk6\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.327567 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b56dea6f-784c-4bd6-b1fe-e34acac80980-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.327657 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b56dea6f-784c-4bd6-b1fe-e34acac80980-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.327682 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmk6\" (UniqueName: 
\"kubernetes.io/projected/b56dea6f-784c-4bd6-b1fe-e34acac80980-kube-api-access-9wmk6\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.327718 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b56dea6f-784c-4bd6-b1fe-e34acac80980-metrics-client-ca\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.328743 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b56dea6f-784c-4bd6-b1fe-e34acac80980-metrics-client-ca\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.333352 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b56dea6f-784c-4bd6-b1fe-e34acac80980-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.333922 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b56dea6f-784c-4bd6-b1fe-e34acac80980-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: 
I0202 10:37:39.355061 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmk6\" (UniqueName: \"kubernetes.io/projected/b56dea6f-784c-4bd6-b1fe-e34acac80980-kube-api-access-9wmk6\") pod \"prometheus-operator-db54df47d-f6hrw\" (UID: \"b56dea6f-784c-4bd6-b1fe-e34acac80980\") " pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.435251 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" Feb 02 10:37:39 crc kubenswrapper[4845]: I0202 10:37:39.937300 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-f6hrw"] Feb 02 10:37:39 crc kubenswrapper[4845]: W0202 10:37:39.950939 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb56dea6f_784c_4bd6_b1fe_e34acac80980.slice/crio-fc8b71d598095f83c64f5effd4e64ff571a57aeedde349ddb6252a94f8e9f3a5 WatchSource:0}: Error finding container fc8b71d598095f83c64f5effd4e64ff571a57aeedde349ddb6252a94f8e9f3a5: Status 404 returned error can't find the container with id fc8b71d598095f83c64f5effd4e64ff571a57aeedde349ddb6252a94f8e9f3a5 Feb 02 10:37:40 crc kubenswrapper[4845]: I0202 10:37:40.839994 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" event={"ID":"b56dea6f-784c-4bd6-b1fe-e34acac80980","Type":"ContainerStarted","Data":"fc8b71d598095f83c64f5effd4e64ff571a57aeedde349ddb6252a94f8e9f3a5"} Feb 02 10:37:42 crc kubenswrapper[4845]: I0202 10:37:42.854972 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" event={"ID":"b56dea6f-784c-4bd6-b1fe-e34acac80980","Type":"ContainerStarted","Data":"42cc51f8184b2e91c8f117c40ac29dd0c0fad588bc744b945da4b9fa2d828ca9"} Feb 02 10:37:42 crc 
kubenswrapper[4845]: I0202 10:37:42.856114 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" event={"ID":"b56dea6f-784c-4bd6-b1fe-e34acac80980","Type":"ContainerStarted","Data":"8d009f466c90bcdad40b05d268720d63ada3d8fccaf4579a8b498f28e3040ced"} Feb 02 10:37:42 crc kubenswrapper[4845]: I0202 10:37:42.879414 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-f6hrw" podStartSLOduration=1.75905793 podStartE2EDuration="3.879387477s" podCreationTimestamp="2026-02-02 10:37:39 +0000 UTC" firstStartedPulling="2026-02-02 10:37:39.953291361 +0000 UTC m=+341.044692811" lastFinishedPulling="2026-02-02 10:37:42.073620908 +0000 UTC m=+343.165022358" observedRunningTime="2026-02-02 10:37:42.876962443 +0000 UTC m=+343.968363923" watchObservedRunningTime="2026-02-02 10:37:42.879387477 +0000 UTC m=+343.970788947" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.479306 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-x6klx"] Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.480950 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.483298 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.483503 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-hl897" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.483634 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.509902 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69"] Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.511719 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.513354 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.513602 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.516517 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.516760 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-lsl2f" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.545418 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-566fddb674-x6klx"] Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.553102 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69"] Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.575769 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jx8bv"] Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.577202 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.579937 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.584192 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-gddws" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.594194 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601012 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f5148935-61d4-4a95-9c57-7f1ee944dbfb-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601055 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5148935-61d4-4a95-9c57-7f1ee944dbfb-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601078 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcz2b\" (UniqueName: \"kubernetes.io/projected/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-api-access-bcz2b\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601102 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601126 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601151 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " 
pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601223 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhf7b\" (UniqueName: \"kubernetes.io/projected/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-kube-api-access-bhf7b\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601257 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601292 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.601342 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702247 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702298 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702336 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14e66b69-f723-4777-be6f-a522117a2b5b-metrics-client-ca\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702368 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-tls\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702407 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-wtmp\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 
crc kubenswrapper[4845]: I0202 10:37:44.702430 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702454 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-textfile\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702488 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-root\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702511 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f5148935-61d4-4a95-9c57-7f1ee944dbfb-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702532 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5148935-61d4-4a95-9c57-7f1ee944dbfb-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: 
\"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702556 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcz2b\" (UniqueName: \"kubernetes.io/projected/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-api-access-bcz2b\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702580 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhdsv\" (UniqueName: \"kubernetes.io/projected/14e66b69-f723-4777-be6f-a522117a2b5b-kube-api-access-xhdsv\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702611 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702635 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702659 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702689 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-sys\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.702740 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhf7b\" (UniqueName: \"kubernetes.io/projected/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-kube-api-access-bhf7b\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: E0202 10:37:44.703250 4845 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Feb 02 10:37:44 crc kubenswrapper[4845]: E0202 10:37:44.704231 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-tls podName:f5148935-61d4-4a95-9c57-7f1ee944dbfb nodeName:}" failed. No retries permitted until 2026-02-02 10:37:45.204208566 +0000 UTC m=+346.295610016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-sdb69" (UID: "f5148935-61d4-4a95-9c57-7f1ee944dbfb") : secret "kube-state-metrics-tls" not found Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.703805 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f5148935-61d4-4a95-9c57-7f1ee944dbfb-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.703387 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.704360 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.704670 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5148935-61d4-4a95-9c57-7f1ee944dbfb-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.710146 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.712430 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.727632 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhf7b\" (UniqueName: \"kubernetes.io/projected/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-kube-api-access-bhf7b\") pod \"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.731394 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2ba7cee-a22a-4b49-8871-6e54a93e6ebd-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-566fddb674-x6klx\" (UID: \"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.732563 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcz2b\" (UniqueName: \"kubernetes.io/projected/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-api-access-bcz2b\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.804325 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-wtmp\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.804371 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-textfile\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.804399 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-root\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.804429 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhdsv\" (UniqueName: 
\"kubernetes.io/projected/14e66b69-f723-4777-be6f-a522117a2b5b-kube-api-access-xhdsv\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.804453 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.804487 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-sys\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.804536 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14e66b69-f723-4777-be6f-a522117a2b5b-metrics-client-ca\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.804561 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-tls\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.807197 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-tls\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.807417 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-wtmp\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.807645 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-textfile\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.807678 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-root\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.823470 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/14e66b69-f723-4777-be6f-a522117a2b5b-sys\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.823995 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14e66b69-f723-4777-be6f-a522117a2b5b-metrics-client-ca\") pod \"node-exporter-jx8bv\" (UID: 
\"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.828347 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14e66b69-f723-4777-be6f-a522117a2b5b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.839244 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.842602 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhdsv\" (UniqueName: \"kubernetes.io/projected/14e66b69-f723-4777-be6f-a522117a2b5b-kube-api-access-xhdsv\") pod \"node-exporter-jx8bv\" (UID: \"14e66b69-f723-4777-be6f-a522117a2b5b\") " pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:44 crc kubenswrapper[4845]: I0202 10:37:44.895180 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jx8bv" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.214246 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.220033 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f5148935-61d4-4a95-9c57-7f1ee944dbfb-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-sdb69\" (UID: \"f5148935-61d4-4a95-9c57-7f1ee944dbfb\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.301370 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-x6klx"] Feb 02 10:37:45 crc kubenswrapper[4845]: W0202 10:37:45.307001 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ba7cee_a22a_4b49_8871_6e54a93e6ebd.slice/crio-df4b604ce3a5e4429045bef8d6eb900430c8197fe4d05c711d94cf0c049c6280 WatchSource:0}: Error finding container df4b604ce3a5e4429045bef8d6eb900430c8197fe4d05c711d94cf0c049c6280: Status 404 returned error can't find the container with id df4b604ce3a5e4429045bef8d6eb900430c8197fe4d05c711d94cf0c049c6280 Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.444945 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.640915 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.643985 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.651285 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.651448 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.651516 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.651734 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.651931 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.651993 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.652082 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-kfgx6" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.652097 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.659535 4845 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.669780 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.703274 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69"] Feb 02 10:37:45 crc kubenswrapper[4845]: W0202 10:37:45.706387 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5148935_61d4_4a95_9c57_7f1ee944dbfb.slice/crio-1eb58e24425a7988f47e69ac7271f1377e407e0daa2215f9631f516dc76b0505 WatchSource:0}: Error finding container 1eb58e24425a7988f47e69ac7271f1377e407e0daa2215f9631f516dc76b0505: Status 404 returned error can't find the container with id 1eb58e24425a7988f47e69ac7271f1377e407e0daa2215f9631f516dc76b0505 Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848300 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848385 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-web-config\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848427 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848445 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7d75d05-4636-4623-b561-1b2e713ac513-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848552 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848601 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7d75d05-4636-4623-b561-1b2e713ac513-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848728 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc 
kubenswrapper[4845]: I0202 10:37:45.848835 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7d75d05-4636-4623-b561-1b2e713ac513-config-out\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848866 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848937 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7d75d05-4636-4623-b561-1b2e713ac513-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848967 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7d75d05-4636-4623-b561-1b2e713ac513-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.848990 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td9dx\" (UniqueName: \"kubernetes.io/projected/f7d75d05-4636-4623-b561-1b2e713ac513-kube-api-access-td9dx\") pod \"alertmanager-main-0\" (UID: 
\"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.873575 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" event={"ID":"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd","Type":"ContainerStarted","Data":"e16a9c5415275eb9256529c450eb29d6043c882384257d2697352f5dd6fc62d3"} Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.873620 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" event={"ID":"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd","Type":"ContainerStarted","Data":"df4b604ce3a5e4429045bef8d6eb900430c8197fe4d05c711d94cf0c049c6280"} Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.875310 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jx8bv" event={"ID":"14e66b69-f723-4777-be6f-a522117a2b5b","Type":"ContainerStarted","Data":"d8fd4544243bb752cf9fa52679e54e945c840c5bd5edd59957f2cfabc4fbdac4"} Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.876274 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" event={"ID":"f5148935-61d4-4a95-9c57-7f1ee944dbfb","Type":"ContainerStarted","Data":"1eb58e24425a7988f47e69ac7271f1377e407e0daa2215f9631f516dc76b0505"} Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951185 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7d75d05-4636-4623-b561-1b2e713ac513-config-out\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951249 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951291 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7d75d05-4636-4623-b561-1b2e713ac513-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951315 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7d75d05-4636-4623-b561-1b2e713ac513-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951344 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td9dx\" (UniqueName: \"kubernetes.io/projected/f7d75d05-4636-4623-b561-1b2e713ac513-kube-api-access-td9dx\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951372 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951407 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-web-config\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951444 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951464 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7d75d05-4636-4623-b561-1b2e713ac513-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951505 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951530 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7d75d05-4636-4623-b561-1b2e713ac513-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.951568 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: E0202 10:37:45.951693 4845 secret.go:188] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Feb 02 10:37:45 crc kubenswrapper[4845]: E0202 10:37:45.951751 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls podName:f7d75d05-4636-4623-b561-1b2e713ac513 nodeName:}" failed. No retries permitted until 2026-02-02 10:37:46.451731396 +0000 UTC m=+347.543132846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "f7d75d05-4636-4623-b561-1b2e713ac513") : secret "alertmanager-main-tls" not found Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.953025 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7d75d05-4636-4623-b561-1b2e713ac513-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.953323 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7d75d05-4636-4623-b561-1b2e713ac513-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.953541 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7d75d05-4636-4623-b561-1b2e713ac513-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.959170 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.962296 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.964082 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.964817 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 
10:37:45.964919 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-web-config\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.965665 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7d75d05-4636-4623-b561-1b2e713ac513-config-out\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.966670 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7d75d05-4636-4623-b561-1b2e713ac513-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:45 crc kubenswrapper[4845]: I0202 10:37:45.972013 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td9dx\" (UniqueName: \"kubernetes.io/projected/f7d75d05-4636-4623-b561-1b2e713ac513-kube-api-access-td9dx\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.237755 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.237811 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:37:46 crc kubenswrapper[4845]: E0202 10:37:46.459790 4845 secret.go:188] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.459968 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0" Feb 02 10:37:46 crc kubenswrapper[4845]: E0202 10:37:46.460084 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls podName:f7d75d05-4636-4623-b561-1b2e713ac513 nodeName:}" failed. No retries permitted until 2026-02-02 10:37:47.460065279 +0000 UTC m=+348.551466729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "f7d75d05-4636-4623-b561-1b2e713ac513") : secret "alertmanager-main-tls" not found Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.533561 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"] Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.535132 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.544077 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.544122 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.545024 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-2hz75" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.545381 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.546139 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-9f1hp0m1quk94" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.550421 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"] Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.552113 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.556107 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.664696 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " 
pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.666366 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.666550 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.666869 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-grpc-tls\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.667026 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.667135 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07221aaf-e31a-42a4-8033-ba7ad6d21564-metrics-client-ca\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.667298 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-tls\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.667439 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bvwx\" (UniqueName: \"kubernetes.io/projected/07221aaf-e31a-42a4-8033-ba7ad6d21564-kube-api-access-5bvwx\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.768595 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-grpc-tls\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.768990 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: 
\"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.769198 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07221aaf-e31a-42a4-8033-ba7ad6d21564-metrics-client-ca\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.769346 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-tls\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.769478 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bvwx\" (UniqueName: \"kubernetes.io/projected/07221aaf-e31a-42a4-8033-ba7ad6d21564-kube-api-access-5bvwx\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.769633 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.769788 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.769976 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.770665 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/07221aaf-e31a-42a4-8033-ba7ad6d21564-metrics-client-ca\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.774059 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.775198 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " 
pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"
Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.776083 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-grpc-tls\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"
Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.776313 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"
Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.778006 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-tls\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"
Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.781930 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/07221aaf-e31a-42a4-8033-ba7ad6d21564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"
Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.787364 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bvwx\" (UniqueName: \"kubernetes.io/projected/07221aaf-e31a-42a4-8033-ba7ad6d21564-kube-api-access-5bvwx\") pod \"thanos-querier-f47c49b7c-j9wsh\" (UID: \"07221aaf-e31a-42a4-8033-ba7ad6d21564\") " pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"
Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.855291 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"
Feb 02 10:37:46 crc kubenswrapper[4845]: I0202 10:37:46.885748 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" event={"ID":"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd","Type":"ContainerStarted","Data":"a8a2760978ab69931349876782201184bee731dfc6b3210a347e2ff846996665"}
Feb 02 10:37:47 crc kubenswrapper[4845]: I0202 10:37:47.415976 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f47c49b7c-j9wsh"]
Feb 02 10:37:47 crc kubenswrapper[4845]: W0202 10:37:47.447089 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07221aaf_e31a_42a4_8033_ba7ad6d21564.slice/crio-bfed3d4492f47ecc8dc70e58356f7633806204028ffc251608fa59eceb6e08ca WatchSource:0}: Error finding container bfed3d4492f47ecc8dc70e58356f7633806204028ffc251608fa59eceb6e08ca: Status 404 returned error can't find the container with id bfed3d4492f47ecc8dc70e58356f7633806204028ffc251608fa59eceb6e08ca
Feb 02 10:37:47 crc kubenswrapper[4845]: I0202 10:37:47.479692 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0"
Feb 02 10:37:47 crc kubenswrapper[4845]: I0202 10:37:47.485230 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7d75d05-4636-4623-b561-1b2e713ac513-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7d75d05-4636-4623-b561-1b2e713ac513\") " pod="openshift-monitoring/alertmanager-main-0"
Feb 02 10:37:47 crc kubenswrapper[4845]: I0202 10:37:47.786390 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Feb 02 10:37:47 crc kubenswrapper[4845]: I0202 10:37:47.891956 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" event={"ID":"07221aaf-e31a-42a4-8033-ba7ad6d21564","Type":"ContainerStarted","Data":"bfed3d4492f47ecc8dc70e58356f7633806204028ffc251608fa59eceb6e08ca"}
Feb 02 10:37:47 crc kubenswrapper[4845]: I0202 10:37:47.893724 4845 generic.go:334] "Generic (PLEG): container finished" podID="14e66b69-f723-4777-be6f-a522117a2b5b" containerID="5435b2531b4062c1a8e8e48efb799c8ef7ce24967ec1ed9d77c1825230e5377a" exitCode=0
Feb 02 10:37:47 crc kubenswrapper[4845]: I0202 10:37:47.893754 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jx8bv" event={"ID":"14e66b69-f723-4777-be6f-a522117a2b5b","Type":"ContainerDied","Data":"5435b2531b4062c1a8e8e48efb799c8ef7ce24967ec1ed9d77c1825230e5377a"}
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:48.905376 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" event={"ID":"d2ba7cee-a22a-4b49-8871-6e54a93e6ebd","Type":"ContainerStarted","Data":"ec2380ebbd79f655eb51ee2a934fe2f867e4b5abc36822ffd0ac45fcd508f7b0"}
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:48.915647 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jx8bv" event={"ID":"14e66b69-f723-4777-be6f-a522117a2b5b","Type":"ContainerStarted","Data":"a9e6442ab44737a0bb960a0fb9b2b793c2f9525ccfb2c481b9f51c7108f6bf78"}
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:48.915689 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jx8bv" event={"ID":"14e66b69-f723-4777-be6f-a522117a2b5b","Type":"ContainerStarted","Data":"f3cf301607078b11c59a124209db136776e9477d316d4651c1f1262e2884a007"}
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:48.925947 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" event={"ID":"f5148935-61d4-4a95-9c57-7f1ee944dbfb","Type":"ContainerStarted","Data":"fc43efbe8f84ad54806474e5b36014659a62e37ca077068a64698aa50a600b44"}
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:48.926025 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" event={"ID":"f5148935-61d4-4a95-9c57-7f1ee944dbfb","Type":"ContainerStarted","Data":"adc5a4d1188e3599e3e8115cfd681f96ff5178b4549e2418f7a9db182776df15"}
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:48.932021 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-x6klx" podStartSLOduration=2.485963937 podStartE2EDuration="4.932004218s" podCreationTimestamp="2026-02-02 10:37:44 +0000 UTC" firstStartedPulling="2026-02-02 10:37:45.950734246 +0000 UTC m=+347.042135696" lastFinishedPulling="2026-02-02 10:37:48.396774537 +0000 UTC m=+349.488175977" observedRunningTime="2026-02-02 10:37:48.929015507 +0000 UTC m=+350.020416957" watchObservedRunningTime="2026-02-02 10:37:48.932004218 +0000 UTC m=+350.023405668"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.258739 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jx8bv" podStartSLOduration=3.324197153 podStartE2EDuration="5.258719684s" podCreationTimestamp="2026-02-02 10:37:44 +0000 UTC" firstStartedPulling="2026-02-02 10:37:44.931527352 +0000 UTC m=+346.022928802" lastFinishedPulling="2026-02-02 10:37:46.866049883 +0000 UTC m=+347.957451333" observedRunningTime="2026-02-02 10:37:48.958622727 +0000 UTC m=+350.050024217" watchObservedRunningTime="2026-02-02 10:37:49.258719684 +0000 UTC m=+350.350121134"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.261639 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-986c9fb64-5l8tt"]
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.262446 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.280404 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-986c9fb64-5l8tt"]
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.409621 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-service-ca\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.409677 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-console-config\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.409727 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-oauth-serving-cert\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.409751 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhj26\" (UniqueName: \"kubernetes.io/projected/615f8561-b519-43b6-8864-9b1275443e98-kube-api-access-mhj26\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.409813 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-serving-cert\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.409871 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-oauth-config\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.409936 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-trusted-ca-bundle\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.511798 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-serving-cert\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.511845 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-oauth-config\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.511902 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-trusted-ca-bundle\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.511956 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-service-ca\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.511975 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-console-config\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.512006 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-oauth-serving-cert\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.512021 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhj26\" (UniqueName: \"kubernetes.io/projected/615f8561-b519-43b6-8864-9b1275443e98-kube-api-access-mhj26\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.513089 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-console-config\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.513282 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-trusted-ca-bundle\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.513294 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-service-ca\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.513384 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-oauth-serving-cert\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.517513 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-oauth-config\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.517926 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-serving-cert\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.530599 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhj26\" (UniqueName: \"kubernetes.io/projected/615f8561-b519-43b6-8864-9b1275443e98-kube-api-access-mhj26\") pod \"console-986c9fb64-5l8tt\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.578323 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-986c9fb64-5l8tt"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.634325 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Feb 02 10:37:49 crc kubenswrapper[4845]: W0202 10:37:49.643360 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d75d05_4636_4623_b561_1b2e713ac513.slice/crio-bcf7fba67453e8f1dab59e9940c4fa120757e4fe8e9f7b2607bbe33225d17342 WatchSource:0}: Error finding container bcf7fba67453e8f1dab59e9940c4fa120757e4fe8e9f7b2607bbe33225d17342: Status 404 returned error can't find the container with id bcf7fba67453e8f1dab59e9940c4fa120757e4fe8e9f7b2607bbe33225d17342
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.926664 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-76d65679c8-9dw7p"]
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.928027 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.932011 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.932271 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.932507 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.933028 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.933151 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-f978s"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.933308 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-cjt1cflo97mfk"
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.939972 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" event={"ID":"f5148935-61d4-4a95-9c57-7f1ee944dbfb","Type":"ContainerStarted","Data":"95dcf38fcdf75b06770cc60660242daaefe50de763bda19388b7f3d4c05fe105"}
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.944536 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7d75d05-4636-4623-b561-1b2e713ac513","Type":"ContainerStarted","Data":"bcf7fba67453e8f1dab59e9940c4fa120757e4fe8e9f7b2607bbe33225d17342"}
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.952043 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-76d65679c8-9dw7p"]
Feb 02 10:37:49 crc kubenswrapper[4845]: I0202 10:37:49.977364 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-sdb69" podStartSLOduration=3.294567322 podStartE2EDuration="5.977347885s" podCreationTimestamp="2026-02-02 10:37:44 +0000 UTC" firstStartedPulling="2026-02-02 10:37:45.709185578 +0000 UTC m=+346.800587038" lastFinishedPulling="2026-02-02 10:37:48.391966151 +0000 UTC m=+349.483367601" observedRunningTime="2026-02-02 10:37:49.975605873 +0000 UTC m=+351.067007343" watchObservedRunningTime="2026-02-02 10:37:49.977347885 +0000 UTC m=+351.068749335"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:49.999208 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-986c9fb64-5l8tt"]
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.132958 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-client-certs\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.133024 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.133048 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-server-tls\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.133092 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-client-ca-bundle\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.133122 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfgfq\" (UniqueName: \"kubernetes.io/projected/ec05463b-fba2-442c-9ba3-893de7b61f92-kube-api-access-zfgfq\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.133148 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-metrics-server-audit-profiles\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.133407 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ec05463b-fba2-442c-9ba3-893de7b61f92-audit-log\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.234628 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-client-certs\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.234692 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.234721 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-server-tls\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.234767 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-client-ca-bundle\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.234795 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfgfq\" (UniqueName: \"kubernetes.io/projected/ec05463b-fba2-442c-9ba3-893de7b61f92-kube-api-access-zfgfq\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.234829 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-metrics-server-audit-profiles\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.234915 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ec05463b-fba2-442c-9ba3-893de7b61f92-audit-log\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.235848 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.235941 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ec05463b-fba2-442c-9ba3-893de7b61f92-audit-log\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.236638 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-metrics-server-audit-profiles\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.247724 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-client-certs\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.258573 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-server-tls\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.259514 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"]
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.260375 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.261856 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-client-ca-bundle\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.264762 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.265992 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.266763 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfgfq\" (UniqueName: \"kubernetes.io/projected/ec05463b-fba2-442c-9ba3-893de7b61f92-kube-api-access-zfgfq\") pod \"metrics-server-76d65679c8-9dw7p\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.270099 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"]
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.438317 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f5753406-2b60-4929-85b3-8e01c37218b3-monitoring-plugin-cert\") pod \"monitoring-plugin-db4cbb94b-xntgm\" (UID: \"f5753406-2b60-4929-85b3-8e01c37218b3\") " pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.539459 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f5753406-2b60-4929-85b3-8e01c37218b3-monitoring-plugin-cert\") pod \"monitoring-plugin-db4cbb94b-xntgm\" (UID: \"f5753406-2b60-4929-85b3-8e01c37218b3\") " pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.546296 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.547193 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f5753406-2b60-4929-85b3-8e01c37218b3-monitoring-plugin-cert\") pod \"monitoring-plugin-db4cbb94b-xntgm\" (UID: \"f5753406-2b60-4929-85b3-8e01c37218b3\") " pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.615269 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.916870 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.921020 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.923228 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.923384 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.923551 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.930420 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.930792 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-7t4uaoa6dn2sc"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.931407 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.931559 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.931641 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.931686 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-6pqjf"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.931744 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.932051 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.935752 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.940328 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.943939 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.970518 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.970590 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-web-config\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.970637 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-config\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.970668 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.970730 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.970853 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.970933 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.970989 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.971012 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-645tt\" (UniqueName: \"kubernetes.io/projected/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-kube-api-access-645tt\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.971051 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.971091 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-config-out\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.971224 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.972368 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0"
Feb 02 10:37:50 crc
kubenswrapper[4845]: I0202 10:37:50.972405 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.973082 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.982631 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.982916 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.983002 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.988426 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-986c9fb64-5l8tt" event={"ID":"615f8561-b519-43b6-8864-9b1275443e98","Type":"ContainerStarted","Data":"9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38"} Feb 02 10:37:50 crc kubenswrapper[4845]: I0202 10:37:50.988474 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-986c9fb64-5l8tt" event={"ID":"615f8561-b519-43b6-8864-9b1275443e98","Type":"ContainerStarted","Data":"3fbb867d7ca80df191b9729b7982330f959b62e993162b6e2d6a5e4f58aec846"} Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.007536 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-986c9fb64-5l8tt" podStartSLOduration=2.007520252 podStartE2EDuration="2.007520252s" podCreationTimestamp="2026-02-02 10:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:37:51.001945563 +0000 UTC m=+352.093347013" watchObservedRunningTime="2026-02-02 10:37:51.007520252 +0000 UTC m=+352.098921702" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.084846 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.084937 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085027 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085072 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085091 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-web-config\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085119 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-config\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085147 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085164 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085197 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085212 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085238 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085255 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-645tt\" (UniqueName: 
\"kubernetes.io/projected/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-kube-api-access-645tt\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085275 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085295 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-config-out\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085335 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085389 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085421 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.085455 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.086060 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.086666 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.086993 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.100624 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-web-config\") pod 
\"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.103556 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.103828 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-645tt\" (UniqueName: \"kubernetes.io/projected/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-kube-api-access-645tt\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.103852 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.104763 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.104904 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.104953 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-config\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.109498 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.110564 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-config-out\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.110620 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.110728 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " 
pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.111199 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.111341 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.111549 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.111663 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/92c6b031-d11f-4f89-84e5-bbbe36ea3bba-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"92c6b031-d11f-4f89-84e5-bbbe36ea3bba\") " pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.282666 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:37:51 crc kubenswrapper[4845]: I0202 10:37:51.873708 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"] Feb 02 10:37:52 crc kubenswrapper[4845]: I0202 10:37:52.001503 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" event={"ID":"07221aaf-e31a-42a4-8033-ba7ad6d21564","Type":"ContainerStarted","Data":"682641897f869349de46992bb4ffa28e8d32740f2b7661ad69a3de1f754a28e0"} Feb 02 10:37:52 crc kubenswrapper[4845]: I0202 10:37:52.013471 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" event={"ID":"f5753406-2b60-4929-85b3-8e01c37218b3","Type":"ContainerStarted","Data":"7967ac4b496b366ce99f5e32cc502e0dcf108c64a09981d769ca1e0919a03dd6"} Feb 02 10:37:52 crc kubenswrapper[4845]: I0202 10:37:52.183053 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-76d65679c8-9dw7p"] Feb 02 10:37:52 crc kubenswrapper[4845]: I0202 10:37:52.296147 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Feb 02 10:37:52 crc kubenswrapper[4845]: W0202 10:37:52.311383 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c6b031_d11f_4f89_84e5_bbbe36ea3bba.slice/crio-adbc1ecb282316c17d31de4343b73f6caf2a0ba2e99fae9631669fba198c3172 WatchSource:0}: Error finding container adbc1ecb282316c17d31de4343b73f6caf2a0ba2e99fae9631669fba198c3172: Status 404 returned error can't find the container with id adbc1ecb282316c17d31de4343b73f6caf2a0ba2e99fae9631669fba198c3172 Feb 02 10:37:53 crc kubenswrapper[4845]: I0202 10:37:53.020029 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" 
event={"ID":"ec05463b-fba2-442c-9ba3-893de7b61f92","Type":"ContainerStarted","Data":"9c81f12ccf75970d5f3c58822634af178231901d9efb8c97f05d144b2887fc22"} Feb 02 10:37:53 crc kubenswrapper[4845]: I0202 10:37:53.022796 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" event={"ID":"07221aaf-e31a-42a4-8033-ba7ad6d21564","Type":"ContainerStarted","Data":"265f52b96dae2af9ca161199641671dcd0244053ad41833a34413da5fcd7546c"} Feb 02 10:37:53 crc kubenswrapper[4845]: I0202 10:37:53.022826 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" event={"ID":"07221aaf-e31a-42a4-8033-ba7ad6d21564","Type":"ContainerStarted","Data":"b668e8441ad52954d88be68329fb6d82709073834dd9c8df2d5d66d5b1cf2377"} Feb 02 10:37:53 crc kubenswrapper[4845]: I0202 10:37:53.024061 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92c6b031-d11f-4f89-84e5-bbbe36ea3bba","Type":"ContainerStarted","Data":"adbc1ecb282316c17d31de4343b73f6caf2a0ba2e99fae9631669fba198c3172"} Feb 02 10:37:54 crc kubenswrapper[4845]: I0202 10:37:54.035695 4845 generic.go:334] "Generic (PLEG): container finished" podID="f7d75d05-4636-4623-b561-1b2e713ac513" containerID="3af08772545699ef7b9596cdd64da778a35bb868396259805a90b121052198a6" exitCode=0 Feb 02 10:37:54 crc kubenswrapper[4845]: I0202 10:37:54.036077 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7d75d05-4636-4623-b561-1b2e713ac513","Type":"ContainerDied","Data":"3af08772545699ef7b9596cdd64da778a35bb868396259805a90b121052198a6"} Feb 02 10:37:54 crc kubenswrapper[4845]: I0202 10:37:54.041961 4845 generic.go:334] "Generic (PLEG): container finished" podID="92c6b031-d11f-4f89-84e5-bbbe36ea3bba" containerID="eaed1c05688f22ab8cc4f55eefa5c9f3f8e492a46e0715c07adb13c6a8ab0aa7" exitCode=0 Feb 02 10:37:54 crc kubenswrapper[4845]: I0202 
10:37:54.041993 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92c6b031-d11f-4f89-84e5-bbbe36ea3bba","Type":"ContainerDied","Data":"eaed1c05688f22ab8cc4f55eefa5c9f3f8e492a46e0715c07adb13c6a8ab0aa7"} Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.067027 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" event={"ID":"07221aaf-e31a-42a4-8033-ba7ad6d21564","Type":"ContainerStarted","Data":"29ebd4091191dd5397a8e1d783aefa4c8e4c8d5cd67293e6735c3bfbf5d39864"} Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.067664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" event={"ID":"07221aaf-e31a-42a4-8033-ba7ad6d21564","Type":"ContainerStarted","Data":"fd65c200971d2cb072efd12f4a9152d2ab17f8031c1ebee0923a018bfa090ffb"} Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.067768 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" event={"ID":"07221aaf-e31a-42a4-8033-ba7ad6d21564","Type":"ContainerStarted","Data":"8e0e74b1568991705ba743e41fce420885a125f511789138985579e8ddfbbf03"} Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.070920 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.083829 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.088342 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" event={"ID":"f5753406-2b60-4929-85b3-8e01c37218b3","Type":"ContainerStarted","Data":"d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4"} Feb 02 10:37:57 crc 
kubenswrapper[4845]: I0202 10:37:57.088868 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.093998 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" event={"ID":"ec05463b-fba2-442c-9ba3-893de7b61f92","Type":"ContainerStarted","Data":"a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b"} Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.101858 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.115103 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-f47c49b7c-j9wsh" podStartSLOduration=2.484939295 podStartE2EDuration="11.115084423s" podCreationTimestamp="2026-02-02 10:37:46 +0000 UTC" firstStartedPulling="2026-02-02 10:37:47.450314993 +0000 UTC m=+348.541716443" lastFinishedPulling="2026-02-02 10:37:56.080460121 +0000 UTC m=+357.171861571" observedRunningTime="2026-02-02 10:37:57.100281923 +0000 UTC m=+358.191683373" watchObservedRunningTime="2026-02-02 10:37:57.115084423 +0000 UTC m=+358.206485873" Feb 02 10:37:57 crc kubenswrapper[4845]: I0202 10:37:57.121700 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" podStartSLOduration=4.228803675 podStartE2EDuration="8.121681843s" podCreationTimestamp="2026-02-02 10:37:49 +0000 UTC" firstStartedPulling="2026-02-02 10:37:52.193953327 +0000 UTC m=+353.285354777" lastFinishedPulling="2026-02-02 10:37:56.086831495 +0000 UTC m=+357.178232945" observedRunningTime="2026-02-02 10:37:57.120051114 +0000 UTC m=+358.211452574" watchObservedRunningTime="2026-02-02 10:37:57.121681843 +0000 UTC m=+358.213083293" Feb 02 10:37:57 crc 
kubenswrapper[4845]: I0202 10:37:57.170438 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" podStartSLOduration=3.008871634 podStartE2EDuration="7.170257279s" podCreationTimestamp="2026-02-02 10:37:50 +0000 UTC" firstStartedPulling="2026-02-02 10:37:51.918970823 +0000 UTC m=+353.010372273" lastFinishedPulling="2026-02-02 10:37:56.080356468 +0000 UTC m=+357.171757918" observedRunningTime="2026-02-02 10:37:57.164153684 +0000 UTC m=+358.255555134" watchObservedRunningTime="2026-02-02 10:37:57.170257279 +0000 UTC m=+358.261658749" Feb 02 10:37:59 crc kubenswrapper[4845]: I0202 10:37:59.118585 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7d75d05-4636-4623-b561-1b2e713ac513","Type":"ContainerStarted","Data":"0b439b7c2d0e62e945e5d30cfadcd81c6a64716614daa9633b1b986427581639"} Feb 02 10:37:59 crc kubenswrapper[4845]: I0202 10:37:59.578849 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-986c9fb64-5l8tt" Feb 02 10:37:59 crc kubenswrapper[4845]: I0202 10:37:59.579501 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-986c9fb64-5l8tt" Feb 02 10:37:59 crc kubenswrapper[4845]: I0202 10:37:59.586171 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-986c9fb64-5l8tt" Feb 02 10:38:00 crc kubenswrapper[4845]: I0202 10:38:00.132568 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7d75d05-4636-4623-b561-1b2e713ac513","Type":"ContainerStarted","Data":"33eed5e5eabd757463b54e64495087f70ab77a391ebfacdef6d343ec4c9c0f68"} Feb 02 10:38:00 crc kubenswrapper[4845]: I0202 10:38:00.132869 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f7d75d05-4636-4623-b561-1b2e713ac513","Type":"ContainerStarted","Data":"059d13b6797eb9ec594412fac039c0ccec1379fe9b99eb85abd2e9fd8ca21925"} Feb 02 10:38:00 crc kubenswrapper[4845]: I0202 10:38:00.132905 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7d75d05-4636-4623-b561-1b2e713ac513","Type":"ContainerStarted","Data":"128f9ad9dd13bf8b1b33334c3bc7c4418d334f8dd5688208e95cb87f6f725961"} Feb 02 10:38:00 crc kubenswrapper[4845]: I0202 10:38:00.132915 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7d75d05-4636-4623-b561-1b2e713ac513","Type":"ContainerStarted","Data":"83cf0123e5b040f9cf412941034f94b55d62908644c685237edafad816fa3358"} Feb 02 10:38:00 crc kubenswrapper[4845]: I0202 10:38:00.144210 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-986c9fb64-5l8tt" Feb 02 10:38:00 crc kubenswrapper[4845]: I0202 10:38:00.209475 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8gjpm"] Feb 02 10:38:03 crc kubenswrapper[4845]: I0202 10:38:03.522801 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" podUID="339fe372-b3de-4832-b32f-0218d2c0545b" containerName="registry" containerID="cri-o://c850483e123504ea082a0c8f17db4867ee8a686d50fc85724804f4fd70d8bc85" gracePeriod=30 Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.157064 4845 generic.go:334] "Generic (PLEG): container finished" podID="339fe372-b3de-4832-b32f-0218d2c0545b" containerID="c850483e123504ea082a0c8f17db4867ee8a686d50fc85724804f4fd70d8bc85" exitCode=0 Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.157181 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" 
event={"ID":"339fe372-b3de-4832-b32f-0218d2c0545b","Type":"ContainerDied","Data":"c850483e123504ea082a0c8f17db4867ee8a686d50fc85724804f4fd70d8bc85"} Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.419179 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.539381 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g77qx\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-kube-api-access-g77qx\") pod \"339fe372-b3de-4832-b32f-0218d2c0545b\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.539596 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"339fe372-b3de-4832-b32f-0218d2c0545b\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.539636 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-bound-sa-token\") pod \"339fe372-b3de-4832-b32f-0218d2c0545b\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.539664 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-registry-tls\") pod \"339fe372-b3de-4832-b32f-0218d2c0545b\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.539697 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/339fe372-b3de-4832-b32f-0218d2c0545b-installation-pull-secrets\") pod \"339fe372-b3de-4832-b32f-0218d2c0545b\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.539762 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-trusted-ca\") pod \"339fe372-b3de-4832-b32f-0218d2c0545b\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.539787 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-registry-certificates\") pod \"339fe372-b3de-4832-b32f-0218d2c0545b\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.539806 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/339fe372-b3de-4832-b32f-0218d2c0545b-ca-trust-extracted\") pod \"339fe372-b3de-4832-b32f-0218d2c0545b\" (UID: \"339fe372-b3de-4832-b32f-0218d2c0545b\") " Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.541629 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "339fe372-b3de-4832-b32f-0218d2c0545b" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.541962 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "339fe372-b3de-4832-b32f-0218d2c0545b" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.544644 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/339fe372-b3de-4832-b32f-0218d2c0545b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "339fe372-b3de-4832-b32f-0218d2c0545b" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.544748 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-kube-api-access-g77qx" (OuterVolumeSpecName: "kube-api-access-g77qx") pod "339fe372-b3de-4832-b32f-0218d2c0545b" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b"). InnerVolumeSpecName "kube-api-access-g77qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.545143 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "339fe372-b3de-4832-b32f-0218d2c0545b" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.546004 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "339fe372-b3de-4832-b32f-0218d2c0545b" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.555231 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "339fe372-b3de-4832-b32f-0218d2c0545b" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.564618 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/339fe372-b3de-4832-b32f-0218d2c0545b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "339fe372-b3de-4832-b32f-0218d2c0545b" (UID: "339fe372-b3de-4832-b32f-0218d2c0545b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.641507 4845 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/339fe372-b3de-4832-b32f-0218d2c0545b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.641550 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g77qx\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-kube-api-access-g77qx\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.641562 4845 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.641574 4845 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/339fe372-b3de-4832-b32f-0218d2c0545b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.641586 4845 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/339fe372-b3de-4832-b32f-0218d2c0545b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.641595 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:04 crc kubenswrapper[4845]: I0202 10:38:04.641606 4845 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/339fe372-b3de-4832-b32f-0218d2c0545b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:05 crc 
kubenswrapper[4845]: I0202 10:38:05.167591 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7d75d05-4636-4623-b561-1b2e713ac513","Type":"ContainerStarted","Data":"187f0b0a3c316873ab9f0432efcf0de8df4d4671188a910657b770cedee2ce3e"} Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.182589 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92c6b031-d11f-4f89-84e5-bbbe36ea3bba","Type":"ContainerStarted","Data":"bb59568d7e6dbfa618e30600b9ae3490b6053428641e757c82de3683040bd144"} Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.182641 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92c6b031-d11f-4f89-84e5-bbbe36ea3bba","Type":"ContainerStarted","Data":"497b8db88a0d2799c943bdf6e0b2dd9ea3601a6319a6d26a89fe39d13e3935ad"} Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.182657 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92c6b031-d11f-4f89-84e5-bbbe36ea3bba","Type":"ContainerStarted","Data":"082d4181cf6af1c706594ceca5f14af36c661156c164848bceea4b0433d7d800"} Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.182670 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92c6b031-d11f-4f89-84e5-bbbe36ea3bba","Type":"ContainerStarted","Data":"c74028304991a6503334b572eae0b2b4d575fc9767d97841c6e56fc9ba5bea6c"} Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.182686 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92c6b031-d11f-4f89-84e5-bbbe36ea3bba","Type":"ContainerStarted","Data":"44bfc48da59cb2a817c495e024cd0b7088079bba61ad395d1a49701cd6bdb38c"} Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.182697 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"92c6b031-d11f-4f89-84e5-bbbe36ea3bba","Type":"ContainerStarted","Data":"a67cb8fac6f2d8b0a183276567f2f8aff5b39592b775f317bc448fb884e0e5c2"} Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.183828 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" event={"ID":"339fe372-b3de-4832-b32f-0218d2c0545b","Type":"ContainerDied","Data":"b02dab70c0d575fa03917d15a317aa67ee29edaf5ecdee7aa9680da7630022b9"} Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.183875 4845 scope.go:117] "RemoveContainer" containerID="c850483e123504ea082a0c8f17db4867ee8a686d50fc85724804f4fd70d8bc85" Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.184000 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-thf72" Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.216061 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=10.954754352 podStartE2EDuration="20.216045593s" podCreationTimestamp="2026-02-02 10:37:45 +0000 UTC" firstStartedPulling="2026-02-02 10:37:49.646460004 +0000 UTC m=+350.737861454" lastFinishedPulling="2026-02-02 10:37:58.907751245 +0000 UTC m=+359.999152695" observedRunningTime="2026-02-02 10:38:05.213179786 +0000 UTC m=+366.304581236" watchObservedRunningTime="2026-02-02 10:38:05.216045593 +0000 UTC m=+366.307447043" Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.236758 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-thf72"] Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.244074 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-thf72"] Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.266085 4845 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.267291787 podStartE2EDuration="15.266059793s" podCreationTimestamp="2026-02-02 10:37:50 +0000 UTC" firstStartedPulling="2026-02-02 10:37:52.319310246 +0000 UTC m=+353.410711696" lastFinishedPulling="2026-02-02 10:38:04.318078252 +0000 UTC m=+365.409479702" observedRunningTime="2026-02-02 10:38:05.263498245 +0000 UTC m=+366.354899705" watchObservedRunningTime="2026-02-02 10:38:05.266059793 +0000 UTC m=+366.357461243" Feb 02 10:38:05 crc kubenswrapper[4845]: I0202 10:38:05.725839 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339fe372-b3de-4832-b32f-0218d2c0545b" path="/var/lib/kubelet/pods/339fe372-b3de-4832-b32f-0218d2c0545b/volumes" Feb 02 10:38:06 crc kubenswrapper[4845]: I0202 10:38:06.284087 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:38:10 crc kubenswrapper[4845]: I0202 10:38:10.547971 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" Feb 02 10:38:10 crc kubenswrapper[4845]: I0202 10:38:10.548590 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" Feb 02 10:38:16 crc kubenswrapper[4845]: I0202 10:38:16.238229 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:38:16 crc kubenswrapper[4845]: I0202 10:38:16.238778 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.253278 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8gjpm" podUID="04d41e42-423a-4bac-bc05-3c424c978fd8" containerName="console" containerID="cri-o://966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976" gracePeriod=15 Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.610386 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8gjpm_04d41e42-423a-4bac-bc05-3c424c978fd8/console/0.log" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.610490 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.664400 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-console-config\") pod \"04d41e42-423a-4bac-bc05-3c424c978fd8\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.664477 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-service-ca\") pod \"04d41e42-423a-4bac-bc05-3c424c978fd8\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.664537 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-oauth-serving-cert\") pod \"04d41e42-423a-4bac-bc05-3c424c978fd8\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " Feb 02 10:38:25 crc kubenswrapper[4845]: 
I0202 10:38:25.664595 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-oauth-config\") pod \"04d41e42-423a-4bac-bc05-3c424c978fd8\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.664654 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-trusted-ca-bundle\") pod \"04d41e42-423a-4bac-bc05-3c424c978fd8\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.664694 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bstpx\" (UniqueName: \"kubernetes.io/projected/04d41e42-423a-4bac-bc05-3c424c978fd8-kube-api-access-bstpx\") pod \"04d41e42-423a-4bac-bc05-3c424c978fd8\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.664828 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-serving-cert\") pod \"04d41e42-423a-4bac-bc05-3c424c978fd8\" (UID: \"04d41e42-423a-4bac-bc05-3c424c978fd8\") " Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.667492 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "04d41e42-423a-4bac-bc05-3c424c978fd8" (UID: "04d41e42-423a-4bac-bc05-3c424c978fd8"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.667619 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-service-ca" (OuterVolumeSpecName: "service-ca") pod "04d41e42-423a-4bac-bc05-3c424c978fd8" (UID: "04d41e42-423a-4bac-bc05-3c424c978fd8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.667670 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-console-config" (OuterVolumeSpecName: "console-config") pod "04d41e42-423a-4bac-bc05-3c424c978fd8" (UID: "04d41e42-423a-4bac-bc05-3c424c978fd8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.667778 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "04d41e42-423a-4bac-bc05-3c424c978fd8" (UID: "04d41e42-423a-4bac-bc05-3c424c978fd8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.674268 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "04d41e42-423a-4bac-bc05-3c424c978fd8" (UID: "04d41e42-423a-4bac-bc05-3c424c978fd8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.674770 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d41e42-423a-4bac-bc05-3c424c978fd8-kube-api-access-bstpx" (OuterVolumeSpecName: "kube-api-access-bstpx") pod "04d41e42-423a-4bac-bc05-3c424c978fd8" (UID: "04d41e42-423a-4bac-bc05-3c424c978fd8"). InnerVolumeSpecName "kube-api-access-bstpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.680259 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "04d41e42-423a-4bac-bc05-3c424c978fd8" (UID: "04d41e42-423a-4bac-bc05-3c424c978fd8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.767485 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.767526 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bstpx\" (UniqueName: \"kubernetes.io/projected/04d41e42-423a-4bac-bc05-3c424c978fd8-kube-api-access-bstpx\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.767536 4845 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.767544 4845 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.767553 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.767561 4845 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/04d41e42-423a-4bac-bc05-3c424c978fd8-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:25 crc kubenswrapper[4845]: I0202 10:38:25.767569 4845 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/04d41e42-423a-4bac-bc05-3c424c978fd8-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.317164 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8gjpm_04d41e42-423a-4bac-bc05-3c424c978fd8/console/0.log" Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.317272 4845 generic.go:334] "Generic (PLEG): container finished" podID="04d41e42-423a-4bac-bc05-3c424c978fd8" containerID="966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976" exitCode=2 Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.317338 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8gjpm" event={"ID":"04d41e42-423a-4bac-bc05-3c424c978fd8","Type":"ContainerDied","Data":"966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976"} Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.317382 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8gjpm" 
event={"ID":"04d41e42-423a-4bac-bc05-3c424c978fd8","Type":"ContainerDied","Data":"983de89da05d00215d904c21a6798495b0919b3a61d3d8890b2f47a3f16bcb7f"} Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.317405 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8gjpm" Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.317435 4845 scope.go:117] "RemoveContainer" containerID="966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976" Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.337768 4845 scope.go:117] "RemoveContainer" containerID="966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976" Feb 02 10:38:26 crc kubenswrapper[4845]: E0202 10:38:26.338784 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976\": container with ID starting with 966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976 not found: ID does not exist" containerID="966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976" Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.338817 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976"} err="failed to get container status \"966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976\": rpc error: code = NotFound desc = could not find container \"966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976\": container with ID starting with 966d7e4cb3833109e9bffa9e6cf13b2c4adf6ec09ec74e2fce7c96aacec9b976 not found: ID does not exist" Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.346076 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8gjpm"] Feb 02 10:38:26 crc kubenswrapper[4845]: I0202 10:38:26.352354 4845 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8gjpm"] Feb 02 10:38:27 crc kubenswrapper[4845]: I0202 10:38:27.722168 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d41e42-423a-4bac-bc05-3c424c978fd8" path="/var/lib/kubelet/pods/04d41e42-423a-4bac-bc05-3c424c978fd8/volumes" Feb 02 10:38:30 crc kubenswrapper[4845]: I0202 10:38:30.555801 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" Feb 02 10:38:30 crc kubenswrapper[4845]: I0202 10:38:30.559952 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" Feb 02 10:38:46 crc kubenswrapper[4845]: I0202 10:38:46.237478 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:38:46 crc kubenswrapper[4845]: I0202 10:38:46.238280 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:38:46 crc kubenswrapper[4845]: I0202 10:38:46.238352 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:38:46 crc kubenswrapper[4845]: I0202 10:38:46.239187 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df9230c12c17f28801d9b1be21f07e2881dfba8fde329097a5e90d09e1d981f3"} 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:38:46 crc kubenswrapper[4845]: I0202 10:38:46.239303 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://df9230c12c17f28801d9b1be21f07e2881dfba8fde329097a5e90d09e1d981f3" gracePeriod=600 Feb 02 10:38:46 crc kubenswrapper[4845]: I0202 10:38:46.457225 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="df9230c12c17f28801d9b1be21f07e2881dfba8fde329097a5e90d09e1d981f3" exitCode=0 Feb 02 10:38:46 crc kubenswrapper[4845]: I0202 10:38:46.457625 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"df9230c12c17f28801d9b1be21f07e2881dfba8fde329097a5e90d09e1d981f3"} Feb 02 10:38:46 crc kubenswrapper[4845]: I0202 10:38:46.457667 4845 scope.go:117] "RemoveContainer" containerID="5c10f188dedfdab51fcb9b4bb57eb7eba62d3895c360748e15e01ffa4d95a428" Feb 02 10:38:47 crc kubenswrapper[4845]: I0202 10:38:47.467617 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"511b5a9de737657a9a1ff84c736b95abf52206e96ffdc8cf5decfdca7aa28582"} Feb 02 10:38:51 crc kubenswrapper[4845]: I0202 10:38:51.283806 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:38:51 crc kubenswrapper[4845]: I0202 10:38:51.317992 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:38:51 crc kubenswrapper[4845]: I0202 10:38:51.523543 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.889211 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-76d65679c8-9dw7p"] Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.890268 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" podUID="ec05463b-fba2-442c-9ba3-893de7b61f92" containerName="metrics-server" containerID="cri-o://a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b" gracePeriod=170 Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.894661 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-56bfc86c66-lp8fz"] Feb 02 10:39:08 crc kubenswrapper[4845]: E0202 10:39:08.895131 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339fe372-b3de-4832-b32f-0218d2c0545b" containerName="registry" Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.895156 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="339fe372-b3de-4832-b32f-0218d2c0545b" containerName="registry" Feb 02 10:39:08 crc kubenswrapper[4845]: E0202 10:39:08.895169 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d41e42-423a-4bac-bc05-3c424c978fd8" containerName="console" Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.895179 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d41e42-423a-4bac-bc05-3c424c978fd8" containerName="console" Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.895319 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d41e42-423a-4bac-bc05-3c424c978fd8" containerName="console" Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.895343 4845 
memory_manager.go:354] "RemoveStaleState removing state" podUID="339fe372-b3de-4832-b32f-0218d2c0545b" containerName="registry" Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.895911 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.903552 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-b166r02mp1sa7" Feb 02 10:39:08 crc kubenswrapper[4845]: I0202 10:39:08.918783 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56bfc86c66-lp8fz"] Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.093915 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-secret-metrics-client-certs\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.093990 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7e5424af-7d57-4d12-be1c-dcddc1187cdf-audit-log\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.094017 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-secret-metrics-server-tls\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 
02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.094045 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnswl\" (UniqueName: \"kubernetes.io/projected/7e5424af-7d57-4d12-be1c-dcddc1187cdf-kube-api-access-rnswl\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.094067 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-client-ca-bundle\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.094220 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7e5424af-7d57-4d12-be1c-dcddc1187cdf-metrics-server-audit-profiles\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.094529 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5424af-7d57-4d12-be1c-dcddc1187cdf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.195479 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-secret-metrics-client-certs\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.195539 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7e5424af-7d57-4d12-be1c-dcddc1187cdf-audit-log\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.195579 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-secret-metrics-server-tls\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.195611 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-client-ca-bundle\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.195635 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnswl\" (UniqueName: \"kubernetes.io/projected/7e5424af-7d57-4d12-be1c-dcddc1187cdf-kube-api-access-rnswl\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.195666 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7e5424af-7d57-4d12-be1c-dcddc1187cdf-metrics-server-audit-profiles\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.195731 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5424af-7d57-4d12-be1c-dcddc1187cdf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.196182 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/7e5424af-7d57-4d12-be1c-dcddc1187cdf-audit-log\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.196821 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e5424af-7d57-4d12-be1c-dcddc1187cdf-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.197332 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/7e5424af-7d57-4d12-be1c-dcddc1187cdf-metrics-server-audit-profiles\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " 
pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.202140 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-secret-metrics-server-tls\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.202231 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-client-ca-bundle\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.211583 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7e5424af-7d57-4d12-be1c-dcddc1187cdf-secret-metrics-client-certs\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.216094 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnswl\" (UniqueName: \"kubernetes.io/projected/7e5424af-7d57-4d12-be1c-dcddc1187cdf-kube-api-access-rnswl\") pod \"metrics-server-56bfc86c66-lp8fz\" (UID: \"7e5424af-7d57-4d12-be1c-dcddc1187cdf\") " pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.220408 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.508733 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-56bfc86c66-lp8fz"] Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.606695 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" event={"ID":"7e5424af-7d57-4d12-be1c-dcddc1187cdf","Type":"ContainerStarted","Data":"5008ae273e8c2ed3bc215d704a8cb8a32031a8d776caa8504f953ec686197e97"} Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.832020 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs"] Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.842731 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.879981 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs"] Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.884640 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"] Feb 02 10:39:09 crc kubenswrapper[4845]: I0202 10:39:09.884947 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" podUID="f5753406-2b60-4929-85b3-8e01c37218b3" containerName="monitoring-plugin" containerID="cri-o://d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4" gracePeriod=30 Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.009967 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/4bd9b974-5ead-4e20-ae4a-724c03f0838d-monitoring-plugin-cert\") pod \"monitoring-plugin-75c47d7fdf-fpzvs\" (UID: \"4bd9b974-5ead-4e20-ae4a-724c03f0838d\") " pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.111331 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4bd9b974-5ead-4e20-ae4a-724c03f0838d-monitoring-plugin-cert\") pod \"monitoring-plugin-75c47d7fdf-fpzvs\" (UID: \"4bd9b974-5ead-4e20-ae4a-724c03f0838d\") " pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.117209 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4bd9b974-5ead-4e20-ae4a-724c03f0838d-monitoring-plugin-cert\") pod \"monitoring-plugin-75c47d7fdf-fpzvs\" (UID: \"4bd9b974-5ead-4e20-ae4a-724c03f0838d\") " pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.202257 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.308456 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-db4cbb94b-xntgm_f5753406-2b60-4929-85b3-8e01c37218b3/monitoring-plugin/0.log" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.308535 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.415568 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f5753406-2b60-4929-85b3-8e01c37218b3-monitoring-plugin-cert\") pod \"f5753406-2b60-4929-85b3-8e01c37218b3\" (UID: \"f5753406-2b60-4929-85b3-8e01c37218b3\") " Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.419020 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5753406-2b60-4929-85b3-8e01c37218b3-monitoring-plugin-cert" (OuterVolumeSpecName: "monitoring-plugin-cert") pod "f5753406-2b60-4929-85b3-8e01c37218b3" (UID: "f5753406-2b60-4929-85b3-8e01c37218b3"). InnerVolumeSpecName "monitoring-plugin-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.431753 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs"] Feb 02 10:39:10 crc kubenswrapper[4845]: W0202 10:39:10.437089 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd9b974_5ead_4e20_ae4a_724c03f0838d.slice/crio-c08c16b57884b84cb1b2d213f565267332bfb8756cb5ff46d283053ffd895bd3 WatchSource:0}: Error finding container c08c16b57884b84cb1b2d213f565267332bfb8756cb5ff46d283053ffd895bd3: Status 404 returned error can't find the container with id c08c16b57884b84cb1b2d213f565267332bfb8756cb5ff46d283053ffd895bd3 Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.516808 4845 reconciler_common.go:293] "Volume detached for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f5753406-2b60-4929-85b3-8e01c37218b3-monitoring-plugin-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.612746 4845 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" event={"ID":"7e5424af-7d57-4d12-be1c-dcddc1187cdf","Type":"ContainerStarted","Data":"185fe07283a7c1977eb081babf13cfa850cf10bbd1e0d246cfdc0846b9337db7"} Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.614289 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" event={"ID":"4bd9b974-5ead-4e20-ae4a-724c03f0838d","Type":"ContainerStarted","Data":"0f067f4506778e52e1cf90e598245458d2f6aff5c06048ae0bdd3f8ee36b053e"} Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.614324 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" event={"ID":"4bd9b974-5ead-4e20-ae4a-724c03f0838d","Type":"ContainerStarted","Data":"c08c16b57884b84cb1b2d213f565267332bfb8756cb5ff46d283053ffd895bd3"} Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.617119 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-db4cbb94b-xntgm_f5753406-2b60-4929-85b3-8e01c37218b3/monitoring-plugin/0.log" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.617162 4845 generic.go:334] "Generic (PLEG): container finished" podID="f5753406-2b60-4929-85b3-8e01c37218b3" containerID="d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4" exitCode=2 Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.617185 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" event={"ID":"f5753406-2b60-4929-85b3-8e01c37218b3","Type":"ContainerDied","Data":"d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4"} Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.617204 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" 
event={"ID":"f5753406-2b60-4929-85b3-8e01c37218b3","Type":"ContainerDied","Data":"7967ac4b496b366ce99f5e32cc502e0dcf108c64a09981d769ca1e0919a03dd6"} Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.617223 4845 scope.go:117] "RemoveContainer" containerID="d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.617242 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.633820 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" podStartSLOduration=2.633802236 podStartE2EDuration="2.633802236s" podCreationTimestamp="2026-02-02 10:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:39:10.628744688 +0000 UTC m=+431.720146158" watchObservedRunningTime="2026-02-02 10:39:10.633802236 +0000 UTC m=+431.725203686" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.639520 4845 scope.go:117] "RemoveContainer" containerID="d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4" Feb 02 10:39:10 crc kubenswrapper[4845]: E0202 10:39:10.640499 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4\": container with ID starting with d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4 not found: ID does not exist" containerID="d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.640561 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4"} 
err="failed to get container status \"d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4\": rpc error: code = NotFound desc = could not find container \"d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4\": container with ID starting with d6a4724319214b5af799ddf12176c0892718b0b92f3f9bc2e5eb114ce5ea3ae4 not found: ID does not exist" Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.657227 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"] Feb 02 10:39:10 crc kubenswrapper[4845]: I0202 10:39:10.661513 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/monitoring-plugin-db4cbb94b-xntgm"] Feb 02 10:39:11 crc kubenswrapper[4845]: I0202 10:39:11.720266 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5753406-2b60-4929-85b3-8e01c37218b3" path="/var/lib/kubelet/pods/f5753406-2b60-4929-85b3-8e01c37218b3/volumes" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.203551 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.209029 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.226576 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-75c47d7fdf-fpzvs" podStartSLOduration=11.226549728 podStartE2EDuration="11.226549728s" podCreationTimestamp="2026-02-02 10:39:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:39:11.635993541 +0000 UTC m=+432.727394991" watchObservedRunningTime="2026-02-02 10:39:20.226549728 +0000 UTC m=+441.317951208" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 
10:39:20.334874 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-dd6cc54dd-nz852"] Feb 02 10:39:20 crc kubenswrapper[4845]: E0202 10:39:20.335146 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5753406-2b60-4929-85b3-8e01c37218b3" containerName="monitoring-plugin" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.335158 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5753406-2b60-4929-85b3-8e01c37218b3" containerName="monitoring-plugin" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.335316 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5753406-2b60-4929-85b3-8e01c37218b3" containerName="monitoring-plugin" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.335919 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.355375 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dd6cc54dd-nz852"] Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.524584 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9qjk\" (UniqueName: \"kubernetes.io/projected/760b8b36-f06d-49ac-9de5-72b222f509d0-kube-api-access-f9qjk\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.524736 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-trusted-ca-bundle\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.524783 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-oauth-config\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.524814 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-console-config\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.524841 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-oauth-serving-cert\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.524869 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-serving-cert\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.524942 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-service-ca\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc 
kubenswrapper[4845]: I0202 10:39:20.627027 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-trusted-ca-bundle\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.627119 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-oauth-config\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.627153 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-console-config\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.627187 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-oauth-serving-cert\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.627223 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-serving-cert\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 
10:39:20.627276 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-service-ca\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.627395 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9qjk\" (UniqueName: \"kubernetes.io/projected/760b8b36-f06d-49ac-9de5-72b222f509d0-kube-api-access-f9qjk\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.628825 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-oauth-serving-cert\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.628825 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-console-config\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.629009 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-service-ca\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.629158 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-trusted-ca-bundle\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.634036 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-oauth-config\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.634358 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-serving-cert\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.652440 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9qjk\" (UniqueName: \"kubernetes.io/projected/760b8b36-f06d-49ac-9de5-72b222f509d0-kube-api-access-f9qjk\") pod \"console-dd6cc54dd-nz852\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.660724 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:20 crc kubenswrapper[4845]: I0202 10:39:20.856060 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dd6cc54dd-nz852"] Feb 02 10:39:21 crc kubenswrapper[4845]: I0202 10:39:21.685804 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd6cc54dd-nz852" event={"ID":"760b8b36-f06d-49ac-9de5-72b222f509d0","Type":"ContainerStarted","Data":"57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d"} Feb 02 10:39:21 crc kubenswrapper[4845]: I0202 10:39:21.686157 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd6cc54dd-nz852" event={"ID":"760b8b36-f06d-49ac-9de5-72b222f509d0","Type":"ContainerStarted","Data":"3bdbcedf982f353d1f33f9e0800794674194143b9b56bad7bb0604d71624ce6e"} Feb 02 10:39:21 crc kubenswrapper[4845]: I0202 10:39:21.713116 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dd6cc54dd-nz852" podStartSLOduration=1.713097883 podStartE2EDuration="1.713097883s" podCreationTimestamp="2026-02-02 10:39:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:39:21.710253309 +0000 UTC m=+442.801654799" watchObservedRunningTime="2026-02-02 10:39:21.713097883 +0000 UTC m=+442.804499333" Feb 02 10:39:29 crc kubenswrapper[4845]: I0202 10:39:29.221266 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:29 crc kubenswrapper[4845]: I0202 10:39:29.221846 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:29 crc kubenswrapper[4845]: I0202 10:39:29.226955 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:29 crc kubenswrapper[4845]: I0202 10:39:29.737875 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-56bfc86c66-lp8fz" Feb 02 10:39:30 crc kubenswrapper[4845]: I0202 10:39:30.662303 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:30 crc kubenswrapper[4845]: I0202 10:39:30.662357 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:30 crc kubenswrapper[4845]: I0202 10:39:30.670032 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:30 crc kubenswrapper[4845]: I0202 10:39:30.744681 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:39:30 crc kubenswrapper[4845]: I0202 10:39:30.816197 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-986c9fb64-5l8tt"] Feb 02 10:39:55 crc kubenswrapper[4845]: I0202 10:39:55.861577 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-986c9fb64-5l8tt" podUID="615f8561-b519-43b6-8864-9b1275443e98" containerName="console" containerID="cri-o://9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38" gracePeriod=15 Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.209990 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-986c9fb64-5l8tt_615f8561-b519-43b6-8864-9b1275443e98/console/0.log" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.210417 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-986c9fb64-5l8tt" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.381140 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-serving-cert\") pod \"615f8561-b519-43b6-8864-9b1275443e98\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.381334 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-oauth-serving-cert\") pod \"615f8561-b519-43b6-8864-9b1275443e98\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.381368 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-trusted-ca-bundle\") pod \"615f8561-b519-43b6-8864-9b1275443e98\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.381425 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-service-ca\") pod \"615f8561-b519-43b6-8864-9b1275443e98\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.381447 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhj26\" (UniqueName: \"kubernetes.io/projected/615f8561-b519-43b6-8864-9b1275443e98-kube-api-access-mhj26\") pod \"615f8561-b519-43b6-8864-9b1275443e98\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.381719 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-oauth-config\") pod \"615f8561-b519-43b6-8864-9b1275443e98\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.381749 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-console-config\") pod \"615f8561-b519-43b6-8864-9b1275443e98\" (UID: \"615f8561-b519-43b6-8864-9b1275443e98\") " Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.384580 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-service-ca" (OuterVolumeSpecName: "service-ca") pod "615f8561-b519-43b6-8864-9b1275443e98" (UID: "615f8561-b519-43b6-8864-9b1275443e98"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.384996 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "615f8561-b519-43b6-8864-9b1275443e98" (UID: "615f8561-b519-43b6-8864-9b1275443e98"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.385424 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "615f8561-b519-43b6-8864-9b1275443e98" (UID: "615f8561-b519-43b6-8864-9b1275443e98"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.386128 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-console-config" (OuterVolumeSpecName: "console-config") pod "615f8561-b519-43b6-8864-9b1275443e98" (UID: "615f8561-b519-43b6-8864-9b1275443e98"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.387318 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "615f8561-b519-43b6-8864-9b1275443e98" (UID: "615f8561-b519-43b6-8864-9b1275443e98"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.388109 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "615f8561-b519-43b6-8864-9b1275443e98" (UID: "615f8561-b519-43b6-8864-9b1275443e98"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.388630 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615f8561-b519-43b6-8864-9b1275443e98-kube-api-access-mhj26" (OuterVolumeSpecName: "kube-api-access-mhj26") pod "615f8561-b519-43b6-8864-9b1275443e98" (UID: "615f8561-b519-43b6-8864-9b1275443e98"). InnerVolumeSpecName "kube-api-access-mhj26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.483724 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.483756 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhj26\" (UniqueName: \"kubernetes.io/projected/615f8561-b519-43b6-8864-9b1275443e98-kube-api-access-mhj26\") on node \"crc\" DevicePath \"\"" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.483769 4845 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.483778 4845 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.483787 4845 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/615f8561-b519-43b6-8864-9b1275443e98-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.483795 4845 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.483803 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/615f8561-b519-43b6-8864-9b1275443e98-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:39:56 crc 
kubenswrapper[4845]: I0202 10:39:56.906312 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-986c9fb64-5l8tt_615f8561-b519-43b6-8864-9b1275443e98/console/0.log" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.907108 4845 generic.go:334] "Generic (PLEG): container finished" podID="615f8561-b519-43b6-8864-9b1275443e98" containerID="9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38" exitCode=2 Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.907144 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-986c9fb64-5l8tt" event={"ID":"615f8561-b519-43b6-8864-9b1275443e98","Type":"ContainerDied","Data":"9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38"} Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.907163 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-986c9fb64-5l8tt" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.907183 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-986c9fb64-5l8tt" event={"ID":"615f8561-b519-43b6-8864-9b1275443e98","Type":"ContainerDied","Data":"3fbb867d7ca80df191b9729b7982330f959b62e993162b6e2d6a5e4f58aec846"} Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.907200 4845 scope.go:117] "RemoveContainer" containerID="9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.925163 4845 scope.go:117] "RemoveContainer" containerID="9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38" Feb 02 10:39:56 crc kubenswrapper[4845]: E0202 10:39:56.925619 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38\": container with ID starting with 9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38 not 
found: ID does not exist" containerID="9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.925667 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38"} err="failed to get container status \"9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38\": rpc error: code = NotFound desc = could not find container \"9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38\": container with ID starting with 9a52fdac0cb24cbbe17aceb8d7148b4a37460fce98376930f490c346042e8c38 not found: ID does not exist" Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.934455 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-986c9fb64-5l8tt"] Feb 02 10:39:56 crc kubenswrapper[4845]: I0202 10:39:56.939520 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-986c9fb64-5l8tt"] Feb 02 10:39:57 crc kubenswrapper[4845]: I0202 10:39:57.720871 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615f8561-b519-43b6-8864-9b1275443e98" path="/var/lib/kubelet/pods/615f8561-b519-43b6-8864-9b1275443e98/volumes" Feb 02 10:40:46 crc kubenswrapper[4845]: I0202 10:40:46.237997 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:40:46 crc kubenswrapper[4845]: I0202 10:40:46.238643 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Feb 02 10:41:16 crc kubenswrapper[4845]: I0202 10:41:16.237795 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:41:16 crc kubenswrapper[4845]: I0202 10:41:16.238378 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.330877 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw"] Feb 02 10:41:29 crc kubenswrapper[4845]: E0202 10:41:29.331684 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615f8561-b519-43b6-8864-9b1275443e98" containerName="console" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.331700 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="615f8561-b519-43b6-8864-9b1275443e98" containerName="console" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.331832 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="615f8561-b519-43b6-8864-9b1275443e98" containerName="console" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.332723 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.334905 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.353627 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw"] Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.526696 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.526777 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.526812 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znrpl\" (UniqueName: \"kubernetes.io/projected/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-kube-api-access-znrpl\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: 
I0202 10:41:29.628958 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.629029 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.629063 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znrpl\" (UniqueName: \"kubernetes.io/projected/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-kube-api-access-znrpl\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.629427 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.629502 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.649282 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znrpl\" (UniqueName: \"kubernetes.io/projected/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-kube-api-access-znrpl\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.651782 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:29 crc kubenswrapper[4845]: I0202 10:41:29.881353 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw"] Feb 02 10:41:30 crc kubenswrapper[4845]: I0202 10:41:30.508266 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" event={"ID":"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907","Type":"ContainerStarted","Data":"97729e603ef0e8003e95db2d6b3b5d9b000b47fc6a0f065d959728f4b291e06f"} Feb 02 10:41:30 crc kubenswrapper[4845]: I0202 10:41:30.508321 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" event={"ID":"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907","Type":"ContainerStarted","Data":"af01be384acc5c4c309613560eaef002ab894f86be6972102ca68e7f9316fbb7"} Feb 02 10:41:31 crc kubenswrapper[4845]: I0202 10:41:31.514541 4845 
generic.go:334] "Generic (PLEG): container finished" podID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerID="97729e603ef0e8003e95db2d6b3b5d9b000b47fc6a0f065d959728f4b291e06f" exitCode=0 Feb 02 10:41:31 crc kubenswrapper[4845]: I0202 10:41:31.514607 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" event={"ID":"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907","Type":"ContainerDied","Data":"97729e603ef0e8003e95db2d6b3b5d9b000b47fc6a0f065d959728f4b291e06f"} Feb 02 10:41:31 crc kubenswrapper[4845]: I0202 10:41:31.516771 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:41:33 crc kubenswrapper[4845]: I0202 10:41:33.528051 4845 generic.go:334] "Generic (PLEG): container finished" podID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerID="99242635b3545b38c06ae8cd903b16c6a2e4f31e02fd592ca7f3bde980c6beed" exitCode=0 Feb 02 10:41:33 crc kubenswrapper[4845]: I0202 10:41:33.528152 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" event={"ID":"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907","Type":"ContainerDied","Data":"99242635b3545b38c06ae8cd903b16c6a2e4f31e02fd592ca7f3bde980c6beed"} Feb 02 10:41:34 crc kubenswrapper[4845]: I0202 10:41:34.538358 4845 generic.go:334] "Generic (PLEG): container finished" podID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerID="06718d605415f80d8e16cf0b3a290439b070e84c0fec256aa8f7f7566836a91c" exitCode=0 Feb 02 10:41:34 crc kubenswrapper[4845]: I0202 10:41:34.538411 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" event={"ID":"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907","Type":"ContainerDied","Data":"06718d605415f80d8e16cf0b3a290439b070e84c0fec256aa8f7f7566836a91c"} Feb 02 10:41:35 crc kubenswrapper[4845]: I0202 
10:41:35.813351 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:35 crc kubenswrapper[4845]: I0202 10:41:35.920606 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znrpl\" (UniqueName: \"kubernetes.io/projected/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-kube-api-access-znrpl\") pod \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " Feb 02 10:41:35 crc kubenswrapper[4845]: I0202 10:41:35.920693 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-util\") pod \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " Feb 02 10:41:35 crc kubenswrapper[4845]: I0202 10:41:35.921068 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-bundle\") pod \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\" (UID: \"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907\") " Feb 02 10:41:35 crc kubenswrapper[4845]: I0202 10:41:35.923294 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-bundle" (OuterVolumeSpecName: "bundle") pod "fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" (UID: "fe3cf6fe-df9c-4484-a6af-75fe0b5fa907"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:35 crc kubenswrapper[4845]: I0202 10:41:35.928026 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-kube-api-access-znrpl" (OuterVolumeSpecName: "kube-api-access-znrpl") pod "fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" (UID: "fe3cf6fe-df9c-4484-a6af-75fe0b5fa907"). InnerVolumeSpecName "kube-api-access-znrpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:35 crc kubenswrapper[4845]: I0202 10:41:35.934990 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-util" (OuterVolumeSpecName: "util") pod "fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" (UID: "fe3cf6fe-df9c-4484-a6af-75fe0b5fa907"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:36 crc kubenswrapper[4845]: I0202 10:41:36.022904 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znrpl\" (UniqueName: \"kubernetes.io/projected/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-kube-api-access-znrpl\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:36 crc kubenswrapper[4845]: I0202 10:41:36.022939 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:36 crc kubenswrapper[4845]: I0202 10:41:36.022948 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fe3cf6fe-df9c-4484-a6af-75fe0b5fa907-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:36 crc kubenswrapper[4845]: I0202 10:41:36.569172 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" 
event={"ID":"fe3cf6fe-df9c-4484-a6af-75fe0b5fa907","Type":"ContainerDied","Data":"af01be384acc5c4c309613560eaef002ab894f86be6972102ca68e7f9316fbb7"} Feb 02 10:41:36 crc kubenswrapper[4845]: I0202 10:41:36.569229 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af01be384acc5c4c309613560eaef002ab894f86be6972102ca68e7f9316fbb7" Feb 02 10:41:36 crc kubenswrapper[4845]: I0202 10:41:36.569271 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.271855 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.386383 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-configmap-kubelet-serving-ca-bundle\") pod \"ec05463b-fba2-442c-9ba3-893de7b61f92\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.387053 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfgfq\" (UniqueName: \"kubernetes.io/projected/ec05463b-fba2-442c-9ba3-893de7b61f92-kube-api-access-zfgfq\") pod \"ec05463b-fba2-442c-9ba3-893de7b61f92\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.387161 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-metrics-server-audit-profiles\") pod \"ec05463b-fba2-442c-9ba3-893de7b61f92\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " Feb 02 10:41:39 crc kubenswrapper[4845]: 
I0202 10:41:39.387179 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "ec05463b-fba2-442c-9ba3-893de7b61f92" (UID: "ec05463b-fba2-442c-9ba3-893de7b61f92"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.387211 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-server-tls\") pod \"ec05463b-fba2-442c-9ba3-893de7b61f92\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.387269 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-client-certs\") pod \"ec05463b-fba2-442c-9ba3-893de7b61f92\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.387322 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ec05463b-fba2-442c-9ba3-893de7b61f92-audit-log\") pod \"ec05463b-fba2-442c-9ba3-893de7b61f92\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.387361 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-client-ca-bundle\") pod \"ec05463b-fba2-442c-9ba3-893de7b61f92\" (UID: \"ec05463b-fba2-442c-9ba3-893de7b61f92\") " Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.387623 4845 
reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-configmap-kubelet-serving-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.387803 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-metrics-server-audit-profiles" (OuterVolumeSpecName: "metrics-server-audit-profiles") pod "ec05463b-fba2-442c-9ba3-893de7b61f92" (UID: "ec05463b-fba2-442c-9ba3-893de7b61f92"). InnerVolumeSpecName "metrics-server-audit-profiles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.388632 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec05463b-fba2-442c-9ba3-893de7b61f92-audit-log" (OuterVolumeSpecName: "audit-log") pod "ec05463b-fba2-442c-9ba3-893de7b61f92" (UID: "ec05463b-fba2-442c-9ba3-893de7b61f92"). InnerVolumeSpecName "audit-log". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.393176 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-server-tls" (OuterVolumeSpecName: "secret-metrics-server-tls") pod "ec05463b-fba2-442c-9ba3-893de7b61f92" (UID: "ec05463b-fba2-442c-9ba3-893de7b61f92"). InnerVolumeSpecName "secret-metrics-server-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.393266 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-client-ca-bundle" (OuterVolumeSpecName: "client-ca-bundle") pod "ec05463b-fba2-442c-9ba3-893de7b61f92" (UID: "ec05463b-fba2-442c-9ba3-893de7b61f92"). 
InnerVolumeSpecName "client-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.393614 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "ec05463b-fba2-442c-9ba3-893de7b61f92" (UID: "ec05463b-fba2-442c-9ba3-893de7b61f92"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.393907 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec05463b-fba2-442c-9ba3-893de7b61f92-kube-api-access-zfgfq" (OuterVolumeSpecName: "kube-api-access-zfgfq") pod "ec05463b-fba2-442c-9ba3-893de7b61f92" (UID: "ec05463b-fba2-442c-9ba3-893de7b61f92"). InnerVolumeSpecName "kube-api-access-zfgfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.495919 4845 reconciler_common.go:293] "Volume detached for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-client-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.495963 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfgfq\" (UniqueName: \"kubernetes.io/projected/ec05463b-fba2-442c-9ba3-893de7b61f92-kube-api-access-zfgfq\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.495981 4845 reconciler_common.go:293] "Volume detached for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/ec05463b-fba2-442c-9ba3-893de7b61f92-metrics-server-audit-profiles\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.495996 4845 reconciler_common.go:293] "Volume detached for 
volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-server-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.496010 4845 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ec05463b-fba2-442c-9ba3-893de7b61f92-secret-metrics-client-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.496022 4845 reconciler_common.go:293] "Volume detached for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/ec05463b-fba2-442c-9ba3-893de7b61f92-audit-log\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.585699 4845 generic.go:334] "Generic (PLEG): container finished" podID="ec05463b-fba2-442c-9ba3-893de7b61f92" containerID="a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b" exitCode=0 Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.585805 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" event={"ID":"ec05463b-fba2-442c-9ba3-893de7b61f92","Type":"ContainerDied","Data":"a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b"} Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.585864 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" event={"ID":"ec05463b-fba2-442c-9ba3-893de7b61f92","Type":"ContainerDied","Data":"9c81f12ccf75970d5f3c58822634af178231901d9efb8c97f05d144b2887fc22"} Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.585913 4845 scope.go:117] "RemoveContainer" containerID="a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.586075 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-76d65679c8-9dw7p" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.621129 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/metrics-server-76d65679c8-9dw7p"] Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.621184 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/metrics-server-76d65679c8-9dw7p"] Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.621292 4845 scope.go:117] "RemoveContainer" containerID="a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b" Feb 02 10:41:39 crc kubenswrapper[4845]: E0202 10:41:39.622857 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b\": container with ID starting with a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b not found: ID does not exist" containerID="a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.622935 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b"} err="failed to get container status \"a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b\": rpc error: code = NotFound desc = could not find container \"a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b\": container with ID starting with a4dacca20c6457fa0330baeb2e48ff27f134770904167edbaba74443ba0efc3b not found: ID does not exist" Feb 02 10:41:39 crc kubenswrapper[4845]: I0202 10:41:39.721501 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec05463b-fba2-442c-9ba3-893de7b61f92" path="/var/lib/kubelet/pods/ec05463b-fba2-442c-9ba3-893de7b61f92/volumes" Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.181945 4845 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sh5vd"] Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.182482 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="nbdb" containerID="cri-o://409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.182505 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.182613 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovn-acl-logging" containerID="cri-o://2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.182668 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="sbdb" containerID="cri-o://02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.182651 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="northd" containerID="cri-o://74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.182644 4845 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kube-rbac-proxy-node" containerID="cri-o://8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.182979 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovn-controller" containerID="cri-o://36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.246135 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" containerID="cri-o://e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0" gracePeriod=30 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.595086 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/2.log" Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.595598 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/1.log" Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.595649 4845 generic.go:334] "Generic (PLEG): container finished" podID="310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3" containerID="276b266e719606f3b154e5d01310e24320f7c03059107da4aad492d36a95867b" exitCode=2 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.595723 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kzwst" 
event={"ID":"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3","Type":"ContainerDied","Data":"276b266e719606f3b154e5d01310e24320f7c03059107da4aad492d36a95867b"} Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.595773 4845 scope.go:117] "RemoveContainer" containerID="a7524e10c6d21267ccb31b5667ff6f876f7954e0b4cfca364afc003cb525513f" Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.596385 4845 scope.go:117] "RemoveContainer" containerID="276b266e719606f3b154e5d01310e24320f7c03059107da4aad492d36a95867b" Feb 02 10:41:40 crc kubenswrapper[4845]: E0202 10:41:40.596707 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-kzwst_openshift-multus(310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3)\"" pod="openshift-multus/multus-kzwst" podUID="310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3" Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.598855 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovnkube-controller/3.log" Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.600906 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovn-acl-logging/0.log" Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.601350 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovn-controller/0.log" Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.601774 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0" exitCode=0 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.601791 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3" exitCode=143 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.601801 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb" exitCode=143 Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.601820 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"} Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.601843 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"} Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.601854 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"} Feb 02 10:41:40 crc kubenswrapper[4845]: I0202 10:41:40.660795 4845 scope.go:117] "RemoveContainer" containerID="b1031f0edf0ca31e33f0e6971035d19d313184cece6b93477345110db5db994a" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.431645 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovn-acl-logging/0.log" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.432337 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovn-controller/0.log" Feb 02 
10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.433573 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524713 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-kubelet\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524792 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-ovn-kubernetes\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524822 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-systemd-units\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524843 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-ovn\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524846 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524919 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-config\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524951 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-env-overrides\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524950 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524975 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-script-lib\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.524984 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525002 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovn-node-metrics-cert\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525030 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-systemd\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525066 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdgfw\" (UniqueName: \"kubernetes.io/projected/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-kube-api-access-wdgfw\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525102 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-netd\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525131 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525163 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-var-lib-openvswitch\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525192 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-openvswitch\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525217 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-slash\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525243 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-bin\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525265 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-log-socket\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525284 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-node-log\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: 
\"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525308 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-netns\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525325 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-etc-openvswitch\") pod \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\" (UID: \"7b93b041-3f3f-47ba-a9d4-d09de1b326dc\") " Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525525 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525593 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525629 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525656 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525672 4845 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525688 4845 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525701 4845 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525712 4845 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525747 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc 
kubenswrapper[4845]: I0202 10:41:41.525778 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525805 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525828 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-slash" (OuterVolumeSpecName: "host-slash") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525853 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525876 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-log-socket" (OuterVolumeSpecName: "log-socket") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525917 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-node-log" (OuterVolumeSpecName: "node-log") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525925 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.525942 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.526195 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.526419 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539397 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zgmmp"] Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539661 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovn-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539675 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovn-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539684 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovn-acl-logging" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539689 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovn-acl-logging" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539696 4845 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerName="pull" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539702 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerName="pull" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539711 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539717 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539725 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539730 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539738 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerName="util" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539743 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerName="util" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539751 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539756 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539766 4845 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539773 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539781 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kubecfg-setup" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539786 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kubecfg-setup" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539793 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="northd" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539798 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="northd" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539805 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="nbdb" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539810 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="nbdb" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539816 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kube-rbac-proxy-node" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539821 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kube-rbac-proxy-node" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539832 4845 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerName="extract" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539837 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerName="extract" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539849 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec05463b-fba2-442c-9ba3-893de7b61f92" containerName="metrics-server" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539854 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec05463b-fba2-442c-9ba3-893de7b61f92" containerName="metrics-server" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.539862 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="sbdb" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539867 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="sbdb" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.539991 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540006 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540014 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="northd" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540026 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kube-rbac-proxy-node" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540036 4845 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540046 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovn-acl-logging" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540053 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="nbdb" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540060 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="sbdb" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540067 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe3cf6fe-df9c-4484-a6af-75fe0b5fa907" containerName="extract" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540076 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540082 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540090 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovn-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540099 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec05463b-fba2-442c-9ba3-893de7b61f92" containerName="metrics-server" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.540202 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540208 4845 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.540218 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540223 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.540321 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerName="ovnkube-controller" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.542379 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.542363 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-kube-api-access-wdgfw" (OuterVolumeSpecName: "kube-api-access-wdgfw") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "kube-api-access-wdgfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.542386 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.556449 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7b93b041-3f3f-47ba-a9d4-d09de1b326dc" (UID: "7b93b041-3f3f-47ba-a9d4-d09de1b326dc"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.611437 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/2.log" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.615778 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovn-acl-logging/0.log" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.616411 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sh5vd_7b93b041-3f3f-47ba-a9d4-d09de1b326dc/ovn-controller/0.log" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.616767 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1" exitCode=0 Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.616800 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658" exitCode=0 Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.616812 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397" exitCode=0 Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.616823 4845 
generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73" exitCode=0 Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.616835 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" containerID="8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7" exitCode=0 Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.616899 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617030 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617106 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617126 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617175 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617190 4845 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617200 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sh5vd" event={"ID":"7b93b041-3f3f-47ba-a9d4-d09de1b326dc","Type":"ContainerDied","Data":"000add16087c8ea520a073a5f979801fd8148c67d21762dc28e5704d78c31411"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617214 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617228 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617235 4845 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1"} Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.617278 4845 scope.go:117] "RemoveContainer" containerID="e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628105 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-cni-bin\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628171 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovn-node-metrics-cert\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628209 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-systemd-units\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628236 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-run-ovn-kubernetes\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628263 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-ovn\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628285 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628360 
4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-node-log\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628384 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-systemd\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628404 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-var-lib-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628422 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-kubelet\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628441 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-log-socket\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628465 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-run-netns\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628484 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovnkube-script-lib\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628507 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-env-overrides\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628531 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9gz\" (UniqueName: \"kubernetes.io/projected/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-kube-api-access-ch9gz\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628556 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-cni-netd\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc 
kubenswrapper[4845]: I0202 10:41:41.628585 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628619 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-slash\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628641 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovnkube-config\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628666 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-etc-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628724 4845 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628737 4845 reconciler_common.go:293] "Volume 
detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628750 4845 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628762 4845 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628773 4845 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628783 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdgfw\" (UniqueName: \"kubernetes.io/projected/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-kube-api-access-wdgfw\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628794 4845 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628805 4845 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628816 4845 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628828 4845 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628841 4845 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-slash\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628850 4845 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628863 4845 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-log-socket\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628872 4845 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-node-log\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.628902 4845 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b93b041-3f3f-47ba-a9d4-d09de1b326dc-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.643496 4845 scope.go:117] "RemoveContainer" containerID="02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.671556 4845 scope.go:117] "RemoveContainer" 
containerID="409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.690331 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sh5vd"] Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.691732 4845 scope.go:117] "RemoveContainer" containerID="74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.724731 4845 scope.go:117] "RemoveContainer" containerID="79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.725366 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sh5vd"] Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-run-netns\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729663 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovnkube-script-lib\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729697 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-env-overrides\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729713 
4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9gz\" (UniqueName: \"kubernetes.io/projected/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-kube-api-access-ch9gz\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729730 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-cni-netd\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729770 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729778 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-run-netns\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729811 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-slash\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.729933 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovnkube-config\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730013 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-etc-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730089 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-cni-bin\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730129 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovn-node-metrics-cert\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730188 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-systemd-units\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730253 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-run-ovn-kubernetes\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730303 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-ovn\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730384 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730471 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-node-log\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730501 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-systemd\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730559 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-kubelet\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730583 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-var-lib-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-log-socket\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730913 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-etc-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.730965 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-cni-bin\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731156 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovnkube-script-lib\") pod 
\"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731442 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-systemd-units\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731474 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-cni-netd\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731585 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-kubelet\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731612 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-node-log\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731629 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731693 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-var-lib-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731721 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-slash\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731744 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-host-run-ovn-kubernetes\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731788 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-openvswitch\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.731958 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-log-socket\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 
10:41:41.731982 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-systemd\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.732310 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovnkube-config\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.732351 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-run-ovn\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.732357 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-env-overrides\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.736462 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-ovn-node-metrics-cert\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.761675 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9gz\" (UniqueName: 
\"kubernetes.io/projected/d82eed2b-a080-46e8-86d6-9fd5fc6ee721-kube-api-access-ch9gz\") pod \"ovnkube-node-zgmmp\" (UID: \"d82eed2b-a080-46e8-86d6-9fd5fc6ee721\") " pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.775091 4845 scope.go:117] "RemoveContainer" containerID="8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.807114 4845 scope.go:117] "RemoveContainer" containerID="2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.864582 4845 scope.go:117] "RemoveContainer" containerID="36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.890962 4845 scope.go:117] "RemoveContainer" containerID="9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.908268 4845 scope.go:117] "RemoveContainer" containerID="e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.908684 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.915269 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": container with ID starting with e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0 not found: ID does not exist" containerID="e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.915324 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"} err="failed to get container status \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": rpc error: code = NotFound desc = could not find container \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": container with ID starting with e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.915355 4845 scope.go:117] "RemoveContainer" containerID="02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.917449 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": container with ID starting with 02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1 not found: ID does not exist" containerID="02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.917480 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"} err="failed to get container status \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": rpc error: code = NotFound desc = could not find container \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": container with ID starting with 02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.917498 4845 scope.go:117] "RemoveContainer" containerID="409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.918126 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": container with ID starting with 409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658 not found: ID does not exist" containerID="409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.918174 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"} err="failed to get container status \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": rpc error: code = NotFound desc = could not find container \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": container with ID starting with 409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.918208 4845 scope.go:117] "RemoveContainer" containerID="74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.918586 4845 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": container with ID starting with 74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397 not found: ID does not exist" containerID="74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.918614 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"} err="failed to get container status \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": rpc error: code = NotFound desc = could not find container \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": container with ID starting with 74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.918630 4845 scope.go:117] "RemoveContainer" containerID="79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.918950 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": container with ID starting with 79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73 not found: ID does not exist" containerID="79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.918975 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"} err="failed to get container status \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": rpc error: code = NotFound desc = could not find container 
\"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": container with ID starting with 79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.918990 4845 scope.go:117] "RemoveContainer" containerID="8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.919245 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": container with ID starting with 8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7 not found: ID does not exist" containerID="8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.919266 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"} err="failed to get container status \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": rpc error: code = NotFound desc = could not find container \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": container with ID starting with 8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.919280 4845 scope.go:117] "RemoveContainer" containerID="2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.919647 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": container with ID starting with 2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3 not found: ID does not exist" 
containerID="2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.919682 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"} err="failed to get container status \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": rpc error: code = NotFound desc = could not find container \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": container with ID starting with 2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.919700 4845 scope.go:117] "RemoveContainer" containerID="36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.921557 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": container with ID starting with 36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb not found: ID does not exist" containerID="36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.921581 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"} err="failed to get container status \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": rpc error: code = NotFound desc = could not find container \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": container with ID starting with 36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.921597 4845 scope.go:117] 
"RemoveContainer" containerID="9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1" Feb 02 10:41:41 crc kubenswrapper[4845]: E0202 10:41:41.921875 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": container with ID starting with 9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1 not found: ID does not exist" containerID="9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.921918 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1"} err="failed to get container status \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": rpc error: code = NotFound desc = could not find container \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": container with ID starting with 9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.921931 4845 scope.go:117] "RemoveContainer" containerID="e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.922329 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"} err="failed to get container status \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": rpc error: code = NotFound desc = could not find container \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": container with ID starting with e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.922352 4845 
scope.go:117] "RemoveContainer" containerID="02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.922615 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"} err="failed to get container status \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": rpc error: code = NotFound desc = could not find container \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": container with ID starting with 02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.922647 4845 scope.go:117] "RemoveContainer" containerID="409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.922911 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"} err="failed to get container status \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": rpc error: code = NotFound desc = could not find container \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": container with ID starting with 409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.922931 4845 scope.go:117] "RemoveContainer" containerID="74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.923143 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"} err="failed to get container status \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": rpc 
error: code = NotFound desc = could not find container \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": container with ID starting with 74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.923170 4845 scope.go:117] "RemoveContainer" containerID="79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.923498 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"} err="failed to get container status \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": rpc error: code = NotFound desc = could not find container \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": container with ID starting with 79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.923517 4845 scope.go:117] "RemoveContainer" containerID="8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.923866 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"} err="failed to get container status \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": rpc error: code = NotFound desc = could not find container \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": container with ID starting with 8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.923900 4845 scope.go:117] "RemoveContainer" containerID="2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3" Feb 02 10:41:41 crc 
kubenswrapper[4845]: I0202 10:41:41.924117 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"} err="failed to get container status \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": rpc error: code = NotFound desc = could not find container \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": container with ID starting with 2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.924137 4845 scope.go:117] "RemoveContainer" containerID="36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.924967 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"} err="failed to get container status \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": rpc error: code = NotFound desc = could not find container \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": container with ID starting with 36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.924999 4845 scope.go:117] "RemoveContainer" containerID="9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.925220 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1"} err="failed to get container status \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": rpc error: code = NotFound desc = could not find container \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": container 
with ID starting with 9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.925252 4845 scope.go:117] "RemoveContainer" containerID="e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.925489 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"} err="failed to get container status \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": rpc error: code = NotFound desc = could not find container \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": container with ID starting with e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.925509 4845 scope.go:117] "RemoveContainer" containerID="02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.925700 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"} err="failed to get container status \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": rpc error: code = NotFound desc = could not find container \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": container with ID starting with 02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1 not found: ID does not exist" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.925721 4845 scope.go:117] "RemoveContainer" containerID="409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658" Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.925949 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"} err="failed to get container status \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": rpc error: code = NotFound desc = could not find container \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": container with ID starting with 409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.925967 4845 scope.go:117] "RemoveContainer" containerID="74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.926336 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"} err="failed to get container status \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": rpc error: code = NotFound desc = could not find container \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": container with ID starting with 74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.926400 4845 scope.go:117] "RemoveContainer" containerID="79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.926777 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"} err="failed to get container status \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": rpc error: code = NotFound desc = could not find container \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": container with ID starting with 79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.926802 4845 scope.go:117] "RemoveContainer" containerID="8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.927211 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"} err="failed to get container status \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": rpc error: code = NotFound desc = could not find container \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": container with ID starting with 8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.927232 4845 scope.go:117] "RemoveContainer" containerID="2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.931312 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"} err="failed to get container status \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": rpc error: code = NotFound desc = could not find container \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": container with ID starting with 2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.931344 4845 scope.go:117] "RemoveContainer" containerID="36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.931910 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"} err="failed to get container status \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": rpc error: code = NotFound desc = could not find container \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": container with ID starting with 36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.931929 4845 scope.go:117] "RemoveContainer" containerID="9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.936726 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1"} err="failed to get container status \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": rpc error: code = NotFound desc = could not find container \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": container with ID starting with 9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.936765 4845 scope.go:117] "RemoveContainer" containerID="e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.937330 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"} err="failed to get container status \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": rpc error: code = NotFound desc = could not find container \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": container with ID starting with e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.937387 4845 scope.go:117] "RemoveContainer" containerID="02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.937778 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"} err="failed to get container status \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": rpc error: code = NotFound desc = could not find container \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": container with ID starting with 02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.937829 4845 scope.go:117] "RemoveContainer" containerID="409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.942389 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"} err="failed to get container status \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": rpc error: code = NotFound desc = could not find container \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": container with ID starting with 409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.942519 4845 scope.go:117] "RemoveContainer" containerID="74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.946117 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"} err="failed to get container status \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": rpc error: code = NotFound desc = could not find container \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": container with ID starting with 74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.946218 4845 scope.go:117] "RemoveContainer" containerID="79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.948020 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"} err="failed to get container status \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": rpc error: code = NotFound desc = could not find container \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": container with ID starting with 79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.948119 4845 scope.go:117] "RemoveContainer" containerID="8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.951556 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"} err="failed to get container status \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": rpc error: code = NotFound desc = could not find container \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": container with ID starting with 8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.951651 4845 scope.go:117] "RemoveContainer" containerID="2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.953327 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3"} err="failed to get container status \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": rpc error: code = NotFound desc = could not find container \"2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3\": container with ID starting with 2892ff3b75844390937d52755502815dfadf7751fd15ada1334e1baee4346cf3 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.953416 4845 scope.go:117] "RemoveContainer" containerID="36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.954118 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb"} err="failed to get container status \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": rpc error: code = NotFound desc = could not find container \"36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb\": container with ID starting with 36972d6206bd60feed48b6f9dbccaf4f05d994ea7bdd06336644428c25b2c3fb not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.954215 4845 scope.go:117] "RemoveContainer" containerID="9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.954519 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1"} err="failed to get container status \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": rpc error: code = NotFound desc = could not find container \"9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1\": container with ID starting with 9e9066826bc923668c4573d1732c3256ef6cfc03494d40eb8499775ab5658ae1 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.954610 4845 scope.go:117] "RemoveContainer" containerID="e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.954982 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0"} err="failed to get container status \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": rpc error: code = NotFound desc = could not find container \"e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0\": container with ID starting with e82fe127f51288ef1b1322c9522ff12a2264fb3652d9a6840e4c07868aeae7b0 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.955074 4845 scope.go:117] "RemoveContainer" containerID="02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.955391 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1"} err="failed to get container status \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": rpc error: code = NotFound desc = could not find container \"02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1\": container with ID starting with 02fa53ffb692bec680de2862735d2af4b4cbfe1ae757617ba50bfcf582de44d1 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.955483 4845 scope.go:117] "RemoveContainer" containerID="409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.956058 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658"} err="failed to get container status \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": rpc error: code = NotFound desc = could not find container \"409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658\": container with ID starting with 409f9ac81e645675e175199a51cb31223b386d8e2d985f1a59a1de038b18f658 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.956146 4845 scope.go:117] "RemoveContainer" containerID="74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.956387 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397"} err="failed to get container status \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": rpc error: code = NotFound desc = could not find container \"74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397\": container with ID starting with 74f739ca28bbe3712cf8954196b49eeef72d7361f88fb5f5d686cc61cdaf3397 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.956465 4845 scope.go:117] "RemoveContainer" containerID="79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.956679 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73"} err="failed to get container status \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": rpc error: code = NotFound desc = could not find container \"79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73\": container with ID starting with 79fc5049b517c8f029fff8f78cd5164fe30480307280fa2f2e206e773a2f3f73 not found: ID does not exist"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.956749 4845 scope.go:117] "RemoveContainer" containerID="8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"
Feb 02 10:41:41 crc kubenswrapper[4845]: I0202 10:41:41.956985 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7"} err="failed to get container status \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": rpc error: code = NotFound desc = could not find container \"8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7\": container with ID starting with 8943a1af4d3e926e148f5107a7cbbf5b672b1ce0ef2812f9a5c4245b307721c7 not found: ID does not exist"
Feb 02 10:41:42 crc kubenswrapper[4845]: I0202 10:41:42.623389 4845 generic.go:334] "Generic (PLEG): container finished" podID="d82eed2b-a080-46e8-86d6-9fd5fc6ee721" containerID="6a8aaa7e5b8588aa254156f4f7413bbe2803ade42897c279b1aeadb4a06b61ae" exitCode=0
Feb 02 10:41:42 crc kubenswrapper[4845]: I0202 10:41:42.624523 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerDied","Data":"6a8aaa7e5b8588aa254156f4f7413bbe2803ade42897c279b1aeadb4a06b61ae"}
Feb 02 10:41:42 crc kubenswrapper[4845]: I0202 10:41:42.624630 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"9d078a3ac63aeece2981887726cdf89367123881d3727a55f36dbc54153449ac"}
Feb 02 10:41:43 crc kubenswrapper[4845]: I0202 10:41:43.635647 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"ef26a1e78c7d193b66cf7baf398381e3eb1b7be1c29fa1744759e6ab00ca8cc3"}
Feb 02 10:41:43 crc kubenswrapper[4845]: I0202 10:41:43.635986 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"71a141c05c90158c519861b67deaafa2a2ebb438350eb7ee3b765ef91a7a89a2"}
Feb 02 10:41:43 crc kubenswrapper[4845]: I0202 10:41:43.635997 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"50b65571723e5a30dd1e0df31816ba4954e2ae7c6bc7d5ed56b63d90cb6dbd92"}
Feb 02 10:41:43 crc kubenswrapper[4845]: I0202 10:41:43.721032 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b93b041-3f3f-47ba-a9d4-d09de1b326dc" path="/var/lib/kubelet/pods/7b93b041-3f3f-47ba-a9d4-d09de1b326dc/volumes"
Feb 02 10:41:44 crc kubenswrapper[4845]: I0202 10:41:44.643757 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"a1ee2b8828c8638e143260a38393f97f9f953373ad44d6f6e65ed1217a84a65d"}
Feb 02 10:41:44 crc kubenswrapper[4845]: I0202 10:41:44.643796 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"80288c7c9b9ca70cf6f5dbe5776865cec991199f5502d433db89c9713ed05e4e"}
Feb 02 10:41:45 crc kubenswrapper[4845]: I0202 10:41:45.653251 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"85f1a736b84e9f8d81cf1e8b5aeb90b3c0aaa644b5bd01656d95a11622037880"}
Feb 02 10:41:46 crc kubenswrapper[4845]: I0202 10:41:46.237991 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:41:46 crc kubenswrapper[4845]: I0202 10:41:46.238405 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:41:46 crc kubenswrapper[4845]: I0202 10:41:46.238584 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9"
Feb 02 10:41:46 crc kubenswrapper[4845]: I0202 10:41:46.239596 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"511b5a9de737657a9a1ff84c736b95abf52206e96ffdc8cf5decfdca7aa28582"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 10:41:46 crc kubenswrapper[4845]: I0202 10:41:46.239944 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://511b5a9de737657a9a1ff84c736b95abf52206e96ffdc8cf5decfdca7aa28582" gracePeriod=600
Feb 02 10:41:46 crc kubenswrapper[4845]: I0202 10:41:46.672791 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="511b5a9de737657a9a1ff84c736b95abf52206e96ffdc8cf5decfdca7aa28582" exitCode=0
Feb 02 10:41:46 crc kubenswrapper[4845]: I0202 10:41:46.672874 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"511b5a9de737657a9a1ff84c736b95abf52206e96ffdc8cf5decfdca7aa28582"}
Feb 02 10:41:46 crc kubenswrapper[4845]: I0202 10:41:46.673039 4845 scope.go:117] "RemoveContainer" containerID="df9230c12c17f28801d9b1be21f07e2881dfba8fde329097a5e90d09e1d981f3"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.192747 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"]
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.194135 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.197637 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.197979 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.198014 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-6vc4w"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.209597 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jvdx\" (UniqueName: \"kubernetes.io/projected/308cfce2-8d47-45e6-9153-a8cd92a8758b-kube-api-access-8jvdx\") pod \"obo-prometheus-operator-68bc856cb9-rmj27\" (UID: \"308cfce2-8d47-45e6-9153-a8cd92a8758b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.311110 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jvdx\" (UniqueName: \"kubernetes.io/projected/308cfce2-8d47-45e6-9153-a8cd92a8758b-kube-api-access-8jvdx\") pod \"obo-prometheus-operator-68bc856cb9-rmj27\" (UID: \"308cfce2-8d47-45e6-9153-a8cd92a8758b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.326766 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"]
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.327481 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.335786 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qprcl"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.335829 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.335933 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"]
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.336869 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.354748 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jvdx\" (UniqueName: \"kubernetes.io/projected/308cfce2-8d47-45e6-9153-a8cd92a8758b-kube-api-access-8jvdx\") pod \"obo-prometheus-operator-68bc856cb9-rmj27\" (UID: \"308cfce2-8d47-45e6-9153-a8cd92a8758b\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.467321 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466\" (UID: \"631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.467680 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d289096b-a35d-4a41-90a3-cab735629cc7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch\" (UID: \"d289096b-a35d-4a41-90a3-cab735629cc7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.467800 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466\" (UID: \"631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.467910 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d289096b-a35d-4a41-90a3-cab735629cc7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch\" (UID: \"d289096b-a35d-4a41-90a3-cab735629cc7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.511107 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.526053 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-5wvdz"]
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.526825 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.529276 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.531960 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-wdzpz"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.569103 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466\" (UID: \"631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.569581 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d289096b-a35d-4a41-90a3-cab735629cc7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch\" (UID: \"d289096b-a35d-4a41-90a3-cab735629cc7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.569618 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466\" (UID: \"631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.569644 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d289096b-a35d-4a41-90a3-cab735629cc7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch\" (UID: \"d289096b-a35d-4a41-90a3-cab735629cc7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.583057 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466\" (UID: \"631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.583072 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d289096b-a35d-4a41-90a3-cab735629cc7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch\" (UID: \"d289096b-a35d-4a41-90a3-cab735629cc7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.583298 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d289096b-a35d-4a41-90a3-cab735629cc7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch\" (UID: \"d289096b-a35d-4a41-90a3-cab735629cc7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.589893 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(9b5e30249634e55d12f4d94f18c5c9c46180df0bc10fd9fad30ffec1f337db76): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.589990 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(9b5e30249634e55d12f4d94f18c5c9c46180df0bc10fd9fad30ffec1f337db76): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.590024 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(9b5e30249634e55d12f4d94f18c5c9c46180df0bc10fd9fad30ffec1f337db76): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.590075 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators(308cfce2-8d47-45e6-9153-a8cd92a8758b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators(308cfce2-8d47-45e6-9153-a8cd92a8758b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(9b5e30249634e55d12f4d94f18c5c9c46180df0bc10fd9fad30ffec1f337db76): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" podUID="308cfce2-8d47-45e6-9153-a8cd92a8758b"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.591264 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466\" (UID: \"631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.642163 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.669905 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413_0(95c7df11e34e9a6e0e8fc61e4d80c6184f98bece68c2dfc0ae8f4ae1a2580f62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.669982 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413_0(95c7df11e34e9a6e0e8fc61e4d80c6184f98bece68c2dfc0ae8f4ae1a2580f62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.670002 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413_0(95c7df11e34e9a6e0e8fc61e4d80c6184f98bece68c2dfc0ae8f4ae1a2580f62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.670056 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators(631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators(631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413_0(95c7df11e34e9a6e0e8fc61e4d80c6184f98bece68c2dfc0ae8f4ae1a2580f62): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466" podUID="631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.670707 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqlwr\" (UniqueName: \"kubernetes.io/projected/b75686c5-933f-4f8d-bf87-0229795baf12-kube-api-access-lqlwr\") pod \"observability-operator-59bdc8b94-5wvdz\" (UID: \"b75686c5-933f-4f8d-bf87-0229795baf12\") " pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.670851 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75686c5-933f-4f8d-bf87-0229795baf12-observability-operator-tls\") pod \"observability-operator-59bdc8b94-5wvdz\" (UID: \"b75686c5-933f-4f8d-bf87-0229795baf12\") " pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.681989 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"faf8e85b5f2efdb91a1dcdfb7d3d9ff033956bb15922ba78cb0d90c0661d34f8"}
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.683549 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.689526 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"7507381aef96d67058b511c06ee19db89edff3956425194ae767cae7ad73b260"}
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.714039 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(eaec681cfc3d1f2d2fc8e93809fef2e1bdb458129f9bc7b5f52388a83a08b06f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.714105 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(eaec681cfc3d1f2d2fc8e93809fef2e1bdb458129f9bc7b5f52388a83a08b06f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.714128 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(eaec681cfc3d1f2d2fc8e93809fef2e1bdb458129f9bc7b5f52388a83a08b06f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.714167 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators(d289096b-a35d-4a41-90a3-cab735629cc7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators(d289096b-a35d-4a41-90a3-cab735629cc7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(eaec681cfc3d1f2d2fc8e93809fef2e1bdb458129f9bc7b5f52388a83a08b06f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" podUID="d289096b-a35d-4a41-90a3-cab735629cc7" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.746148 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8hhqb"] Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.747194 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.751406 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-qdwh7" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.772357 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqlwr\" (UniqueName: \"kubernetes.io/projected/b75686c5-933f-4f8d-bf87-0229795baf12-kube-api-access-lqlwr\") pod \"observability-operator-59bdc8b94-5wvdz\" (UID: \"b75686c5-933f-4f8d-bf87-0229795baf12\") " pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.772610 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75686c5-933f-4f8d-bf87-0229795baf12-observability-operator-tls\") pod \"observability-operator-59bdc8b94-5wvdz\" (UID: \"b75686c5-933f-4f8d-bf87-0229795baf12\") " pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.776242 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75686c5-933f-4f8d-bf87-0229795baf12-observability-operator-tls\") pod \"observability-operator-59bdc8b94-5wvdz\" (UID: \"b75686c5-933f-4f8d-bf87-0229795baf12\") " pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.788034 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqlwr\" (UniqueName: \"kubernetes.io/projected/b75686c5-933f-4f8d-bf87-0229795baf12-kube-api-access-lqlwr\") pod \"observability-operator-59bdc8b94-5wvdz\" (UID: \"b75686c5-933f-4f8d-bf87-0229795baf12\") " pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" 
Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.874304 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8hhqb\" (UID: \"1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec\") " pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.874582 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl5dn\" (UniqueName: \"kubernetes.io/projected/1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec-kube-api-access-jl5dn\") pod \"perses-operator-5bf474d74f-8hhqb\" (UID: \"1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec\") " pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.923507 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.952382 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(c7045a081315465faad1c050ecc58431fdc571ada0833d62e56d2b646cbb6e18): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.952492 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(c7045a081315465faad1c050ecc58431fdc571ada0833d62e56d2b646cbb6e18): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.952513 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(c7045a081315465faad1c050ecc58431fdc571ada0833d62e56d2b646cbb6e18): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:47 crc kubenswrapper[4845]: E0202 10:41:47.952565 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-5wvdz_openshift-operators(b75686c5-933f-4f8d-bf87-0229795baf12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-5wvdz_openshift-operators(b75686c5-933f-4f8d-bf87-0229795baf12)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(c7045a081315465faad1c050ecc58431fdc571ada0833d62e56d2b646cbb6e18): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" podUID="b75686c5-933f-4f8d-bf87-0229795baf12" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.975787 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl5dn\" (UniqueName: \"kubernetes.io/projected/1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec-kube-api-access-jl5dn\") pod \"perses-operator-5bf474d74f-8hhqb\" (UID: \"1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec\") " pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.975874 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8hhqb\" (UID: \"1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec\") " pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.976746 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec-openshift-service-ca\") pod \"perses-operator-5bf474d74f-8hhqb\" (UID: \"1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec\") " pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:47 crc kubenswrapper[4845]: I0202 10:41:47.992221 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl5dn\" (UniqueName: \"kubernetes.io/projected/1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec-kube-api-access-jl5dn\") pod \"perses-operator-5bf474d74f-8hhqb\" (UID: \"1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec\") " pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:48 crc kubenswrapper[4845]: I0202 10:41:48.070574 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:48 crc kubenswrapper[4845]: E0202 10:41:48.094642 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(c5f1701ce4247eb02ffb62860f88a816cd7f9eee82d508c49315ccc464e483b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:41:48 crc kubenswrapper[4845]: E0202 10:41:48.094759 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(c5f1701ce4247eb02ffb62860f88a816cd7f9eee82d508c49315ccc464e483b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:48 crc kubenswrapper[4845]: E0202 10:41:48.094840 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(c5f1701ce4247eb02ffb62860f88a816cd7f9eee82d508c49315ccc464e483b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:48 crc kubenswrapper[4845]: E0202 10:41:48.094956 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-8hhqb_openshift-operators(1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-8hhqb_openshift-operators(1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(c5f1701ce4247eb02ffb62860f88a816cd7f9eee82d508c49315ccc464e483b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" podUID="1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec" Feb 02 10:41:49 crc kubenswrapper[4845]: I0202 10:41:49.705450 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" event={"ID":"d82eed2b-a080-46e8-86d6-9fd5fc6ee721","Type":"ContainerStarted","Data":"128480b8c77f1769c8bc0e796797eacce6318444eb182f13681aba2cdcc29a1f"} Feb 02 10:41:49 crc kubenswrapper[4845]: I0202 10:41:49.706577 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:49 crc kubenswrapper[4845]: I0202 10:41:49.739931 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" podStartSLOduration=8.739911933 podStartE2EDuration="8.739911933s" podCreationTimestamp="2026-02-02 10:41:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:41:49.737599027 +0000 UTC m=+590.829000497" watchObservedRunningTime="2026-02-02 10:41:49.739911933 +0000 UTC m=+590.831313423" Feb 02 10:41:49 
crc kubenswrapper[4845]: I0202 10:41:49.798423 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.472238 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"] Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.472377 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.472811 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.478329 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8hhqb"] Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.478460 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.478977 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.510644 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"] Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.510785 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.512560 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.522862 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-5wvdz"] Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.523004 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.523495 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.557214 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"] Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.557351 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466" Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.557805 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.586088 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(68996c8508724fbd9016da0eef3c9d68b8b95ad32e83069c6c57fe1b6827f31b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.586179 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(68996c8508724fbd9016da0eef3c9d68b8b95ad32e83069c6c57fe1b6827f31b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.586209 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(68996c8508724fbd9016da0eef3c9d68b8b95ad32e83069c6c57fe1b6827f31b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.586263 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators(308cfce2-8d47-45e6-9153-a8cd92a8758b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators(308cfce2-8d47-45e6-9153-a8cd92a8758b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(68996c8508724fbd9016da0eef3c9d68b8b95ad32e83069c6c57fe1b6827f31b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" podUID="308cfce2-8d47-45e6-9153-a8cd92a8758b" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.594386 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(8184fb1fbd2d417cd0328ac22b647b2d4bc1a4fa90f381b6a8890b694ae33c33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.594449 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(8184fb1fbd2d417cd0328ac22b647b2d4bc1a4fa90f381b6a8890b694ae33c33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.594468 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(8184fb1fbd2d417cd0328ac22b647b2d4bc1a4fa90f381b6a8890b694ae33c33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.594507 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-8hhqb_openshift-operators(1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-8hhqb_openshift-operators(1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(8184fb1fbd2d417cd0328ac22b647b2d4bc1a4fa90f381b6a8890b694ae33c33): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" podUID="1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.627399 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(60064200f0e8007bebbd5fa26eb450fcc94b3711a5203d7da8fd50da9e98ce30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.627561 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(60064200f0e8007bebbd5fa26eb450fcc94b3711a5203d7da8fd50da9e98ce30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.627587 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(60064200f0e8007bebbd5fa26eb450fcc94b3711a5203d7da8fd50da9e98ce30): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.627684 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators(d289096b-a35d-4a41-90a3-cab735629cc7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators(d289096b-a35d-4a41-90a3-cab735629cc7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(60064200f0e8007bebbd5fa26eb450fcc94b3711a5203d7da8fd50da9e98ce30): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" podUID="d289096b-a35d-4a41-90a3-cab735629cc7" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.638654 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(71fd7d6d6f430db09f5704c4547c6a1c7123cafbcf322cdbb9a3a82557e58ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.638733 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(71fd7d6d6f430db09f5704c4547c6a1c7123cafbcf322cdbb9a3a82557e58ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.638756 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(71fd7d6d6f430db09f5704c4547c6a1c7123cafbcf322cdbb9a3a82557e58ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.638803 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-5wvdz_openshift-operators(b75686c5-933f-4f8d-bf87-0229795baf12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-5wvdz_openshift-operators(b75686c5-933f-4f8d-bf87-0229795baf12)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(71fd7d6d6f430db09f5704c4547c6a1c7123cafbcf322cdbb9a3a82557e58ef2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" podUID="b75686c5-933f-4f8d-bf87-0229795baf12" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.646000 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413_0(e67c8daca8aee1c3b4da005245135281896d4a42360c3395f237374992bf3702): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.646066 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413_0(e67c8daca8aee1c3b4da005245135281896d4a42360c3395f237374992bf3702): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.646089 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413_0(e67c8daca8aee1c3b4da005245135281896d4a42360c3395f237374992bf3702): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:41:50 crc kubenswrapper[4845]: E0202 10:41:50.646129 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators(631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators(631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_openshift-operators_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413_0(e67c8daca8aee1c3b4da005245135281896d4a42360c3395f237374992bf3702): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466" podUID="631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413"
Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.710301 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp"
Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.710399 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp"
Feb 02 10:41:50 crc kubenswrapper[4845]: I0202 10:41:50.756451 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp"
Feb 02 10:41:51 crc kubenswrapper[4845]: I0202 10:41:51.712536 4845 scope.go:117] "RemoveContainer" containerID="276b266e719606f3b154e5d01310e24320f7c03059107da4aad492d36a95867b"
Feb 02 10:41:51 crc kubenswrapper[4845]: E0202 10:41:51.713388 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-kzwst_openshift-multus(310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3)\"" pod="openshift-multus/multus-kzwst" podUID="310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3"
Feb 02 10:42:01 crc kubenswrapper[4845]: I0202 10:42:01.713485 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb"
Feb 02 10:42:01 crc kubenswrapper[4845]: I0202 10:42:01.715472 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb"
Feb 02 10:42:01 crc kubenswrapper[4845]: E0202 10:42:01.742948 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(33b261f34c416aaf3e4f914aede8f77e42a99e5f63df355c9532db4b9e51a7d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:42:01 crc kubenswrapper[4845]: E0202 10:42:01.743076 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(33b261f34c416aaf3e4f914aede8f77e42a99e5f63df355c9532db4b9e51a7d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb"
Feb 02 10:42:01 crc kubenswrapper[4845]: E0202 10:42:01.743109 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(33b261f34c416aaf3e4f914aede8f77e42a99e5f63df355c9532db4b9e51a7d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb"
Feb 02 10:42:01 crc kubenswrapper[4845]: E0202 10:42:01.743184 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-8hhqb_openshift-operators(1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-8hhqb_openshift-operators(1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-8hhqb_openshift-operators_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec_0(33b261f34c416aaf3e4f914aede8f77e42a99e5f63df355c9532db4b9e51a7d9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" podUID="1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec"
Feb 02 10:42:02 crc kubenswrapper[4845]: I0202 10:42:02.711642 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:42:02 crc kubenswrapper[4845]: I0202 10:42:02.714474 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:42:02 crc kubenswrapper[4845]: I0202 10:42:02.714763 4845 scope.go:117] "RemoveContainer" containerID="276b266e719606f3b154e5d01310e24320f7c03059107da4aad492d36a95867b"
Feb 02 10:42:02 crc kubenswrapper[4845]: I0202 10:42:02.715815 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:42:02 crc kubenswrapper[4845]: I0202 10:42:02.718432 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:42:02 crc kubenswrapper[4845]: E0202 10:42:02.796000 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(b89f8f8709d56d3be6ff6b6645b0e14829f11b1502e9f4cf869cef757ace04de): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:42:02 crc kubenswrapper[4845]: E0202 10:42:02.796098 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(b89f8f8709d56d3be6ff6b6645b0e14829f11b1502e9f4cf869cef757ace04de): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:42:02 crc kubenswrapper[4845]: E0202 10:42:02.796131 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(b89f8f8709d56d3be6ff6b6645b0e14829f11b1502e9f4cf869cef757ace04de): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:42:02 crc kubenswrapper[4845]: E0202 10:42:02.796194 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators(d289096b-a35d-4a41-90a3-cab735629cc7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators(d289096b-a35d-4a41-90a3-cab735629cc7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_openshift-operators_d289096b-a35d-4a41-90a3-cab735629cc7_0(b89f8f8709d56d3be6ff6b6645b0e14829f11b1502e9f4cf869cef757ace04de): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" podUID="d289096b-a35d-4a41-90a3-cab735629cc7"
Feb 02 10:42:02 crc kubenswrapper[4845]: E0202 10:42:02.836028 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(285e78de503dbd5bcc7a75f4a7aecc5a78f218771235523977c6c99aa441d169): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:42:02 crc kubenswrapper[4845]: E0202 10:42:02.836102 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(285e78de503dbd5bcc7a75f4a7aecc5a78f218771235523977c6c99aa441d169): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:42:02 crc kubenswrapper[4845]: E0202 10:42:02.836123 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(285e78de503dbd5bcc7a75f4a7aecc5a78f218771235523977c6c99aa441d169): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:42:02 crc kubenswrapper[4845]: E0202 10:42:02.836195 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators(308cfce2-8d47-45e6-9153-a8cd92a8758b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators(308cfce2-8d47-45e6-9153-a8cd92a8758b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-rmj27_openshift-operators_308cfce2-8d47-45e6-9153-a8cd92a8758b_0(285e78de503dbd5bcc7a75f4a7aecc5a78f218771235523977c6c99aa441d169): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" podUID="308cfce2-8d47-45e6-9153-a8cd92a8758b"
Feb 02 10:42:03 crc kubenswrapper[4845]: I0202 10:42:03.711959 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:42:03 crc kubenswrapper[4845]: I0202 10:42:03.712855 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:42:03 crc kubenswrapper[4845]: E0202 10:42:03.741029 4845 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(b0a95bce9ea9cd5d8076a5251ac1a9abd3b11448e537696e180b7232316cf05b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 02 10:42:03 crc kubenswrapper[4845]: E0202 10:42:03.741118 4845 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(b0a95bce9ea9cd5d8076a5251ac1a9abd3b11448e537696e180b7232316cf05b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:42:03 crc kubenswrapper[4845]: E0202 10:42:03.741152 4845 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(b0a95bce9ea9cd5d8076a5251ac1a9abd3b11448e537696e180b7232316cf05b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:42:03 crc kubenswrapper[4845]: E0202 10:42:03.741209 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-5wvdz_openshift-operators(b75686c5-933f-4f8d-bf87-0229795baf12)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-5wvdz_openshift-operators(b75686c5-933f-4f8d-bf87-0229795baf12)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-5wvdz_openshift-operators_b75686c5-933f-4f8d-bf87-0229795baf12_0(b0a95bce9ea9cd5d8076a5251ac1a9abd3b11448e537696e180b7232316cf05b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" podUID="b75686c5-933f-4f8d-bf87-0229795baf12"
Feb 02 10:42:03 crc kubenswrapper[4845]: I0202 10:42:03.793093 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kzwst_310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3/kube-multus/2.log"
Feb 02 10:42:03 crc kubenswrapper[4845]: I0202 10:42:03.793149 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kzwst" event={"ID":"310f06ec-b9c5-40c9-aeb9-a6e4ef5304c3","Type":"ContainerStarted","Data":"2b4a07b1d9171eb411a82de7135210292ed9c2a5fc790f6fd74c2e539f900185"}
Feb 02 10:42:04 crc kubenswrapper[4845]: I0202 10:42:04.712446 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:42:04 crc kubenswrapper[4845]: I0202 10:42:04.713482 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"
Feb 02 10:42:05 crc kubenswrapper[4845]: I0202 10:42:05.113471 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466"]
Feb 02 10:42:05 crc kubenswrapper[4845]: I0202 10:42:05.810826 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466" event={"ID":"631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413","Type":"ContainerStarted","Data":"08d226cd3318e0a212080dd634fff2b9f5e2c691b3e1aed077d9bf1414f241b5"}
Feb 02 10:42:11 crc kubenswrapper[4845]: I0202 10:42:11.933783 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zgmmp"
Feb 02 10:42:12 crc kubenswrapper[4845]: I0202 10:42:12.711737 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb"
Feb 02 10:42:12 crc kubenswrapper[4845]: I0202 10:42:12.712176 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb"
Feb 02 10:42:13 crc kubenswrapper[4845]: I0202 10:42:13.887825 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-8hhqb"]
Feb 02 10:42:14 crc kubenswrapper[4845]: I0202 10:42:14.712134 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:42:14 crc kubenswrapper[4845]: I0202 10:42:14.712876 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"
Feb 02 10:42:14 crc kubenswrapper[4845]: I0202 10:42:14.898508 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" event={"ID":"1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec","Type":"ContainerStarted","Data":"3afef2be0e2f5a4237a498c299928a513a2258aa23d6c7c000241f200cab02ee"}
Feb 02 10:42:14 crc kubenswrapper[4845]: I0202 10:42:14.900024 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466" event={"ID":"631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413","Type":"ContainerStarted","Data":"484aaa9298f4a42b09d3f3da27122bd50238e83d649c292b63dcd669d5e1c3f7"}
Feb 02 10:42:14 crc kubenswrapper[4845]: I0202 10:42:14.905610 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch"]
Feb 02 10:42:14 crc kubenswrapper[4845]: W0202 10:42:14.907298 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd289096b_a35d_4a41_90a3_cab735629cc7.slice/crio-a4328a34ee8d46edafc84e0b5ccff02a1fb90494cbe6f4f5b5d82b1e9a40b538 WatchSource:0}: Error finding container a4328a34ee8d46edafc84e0b5ccff02a1fb90494cbe6f4f5b5d82b1e9a40b538: Status 404 returned error can't find the container with id a4328a34ee8d46edafc84e0b5ccff02a1fb90494cbe6f4f5b5d82b1e9a40b538
Feb 02 10:42:14 crc kubenswrapper[4845]: I0202 10:42:14.935195 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-kx466" podStartSLOduration=18.652121897 podStartE2EDuration="27.93517195s" podCreationTimestamp="2026-02-02 10:41:47 +0000 UTC" firstStartedPulling="2026-02-02 10:42:05.120803312 +0000 UTC m=+606.212204762" lastFinishedPulling="2026-02-02 10:42:14.403853365 +0000 UTC m=+615.495254815" observedRunningTime="2026-02-02 10:42:14.933406699 +0000 UTC m=+616.024808179" watchObservedRunningTime="2026-02-02 10:42:14.93517195 +0000 UTC m=+616.026573410"
Feb 02 10:42:15 crc kubenswrapper[4845]: I0202 10:42:15.715449 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:42:15 crc kubenswrapper[4845]: I0202 10:42:15.719447 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:42:15 crc kubenswrapper[4845]: I0202 10:42:15.907933 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" event={"ID":"d289096b-a35d-4a41-90a3-cab735629cc7","Type":"ContainerStarted","Data":"10bc03154c57c5c07d663e35caee8425747853d33b7ceea61037a8ff4ddd9c79"}
Feb 02 10:42:15 crc kubenswrapper[4845]: I0202 10:42:15.907986 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" event={"ID":"d289096b-a35d-4a41-90a3-cab735629cc7","Type":"ContainerStarted","Data":"a4328a34ee8d46edafc84e0b5ccff02a1fb90494cbe6f4f5b5d82b1e9a40b538"}
Feb 02 10:42:15 crc kubenswrapper[4845]: I0202 10:42:15.934539 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch" podStartSLOduration=28.934519979 podStartE2EDuration="28.934519979s" podCreationTimestamp="2026-02-02 10:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:42:15.928273532 +0000 UTC m=+617.019674982" watchObservedRunningTime="2026-02-02 10:42:15.934519979 +0000 UTC m=+617.025921429"
Feb 02 10:42:16 crc kubenswrapper[4845]: I0202 10:42:16.216429 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-5wvdz"]
Feb 02 10:42:16 crc kubenswrapper[4845]: W0202 10:42:16.222226 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb75686c5_933f_4f8d_bf87_0229795baf12.slice/crio-b21494f88ed0d601d2f5353d8f0b419da3243e78fbca3176e26c7eaa4a75afef WatchSource:0}: Error finding container b21494f88ed0d601d2f5353d8f0b419da3243e78fbca3176e26c7eaa4a75afef: Status 404 returned error can't find the container with id b21494f88ed0d601d2f5353d8f0b419da3243e78fbca3176e26c7eaa4a75afef
Feb 02 10:42:16 crc kubenswrapper[4845]: I0202 10:42:16.915726 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" event={"ID":"b75686c5-933f-4f8d-bf87-0229795baf12","Type":"ContainerStarted","Data":"b21494f88ed0d601d2f5353d8f0b419da3243e78fbca3176e26c7eaa4a75afef"}
Feb 02 10:42:17 crc kubenswrapper[4845]: I0202 10:42:17.712160 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:42:17 crc kubenswrapper[4845]: I0202 10:42:17.712896 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"
Feb 02 10:42:17 crc kubenswrapper[4845]: I0202 10:42:17.922928 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" event={"ID":"1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec","Type":"ContainerStarted","Data":"634e46c3ef079bbd29e9dec1eb2c2e9c89dd65ebe6e39f1e7b62b411e2e279cd"}
Feb 02 10:42:17 crc kubenswrapper[4845]: I0202 10:42:17.923097 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb"
Feb 02 10:42:17 crc kubenswrapper[4845]: I0202 10:42:17.940111 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb" podStartSLOduration=27.519446516 podStartE2EDuration="30.940090915s" podCreationTimestamp="2026-02-02 10:41:47 +0000 UTC" firstStartedPulling="2026-02-02 10:42:13.896516232 +0000 UTC m=+614.987917682" lastFinishedPulling="2026-02-02 10:42:17.317160631 +0000 UTC m=+618.408562081" observedRunningTime="2026-02-02 10:42:17.937381988 +0000 UTC m=+619.028783468" watchObservedRunningTime="2026-02-02 10:42:17.940090915 +0000 UTC m=+619.031492365"
Feb 02 10:42:18 crc kubenswrapper[4845]: I0202 10:42:18.139828 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27"]
Feb 02 10:42:18 crc kubenswrapper[4845]: W0202 10:42:18.145380 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod308cfce2_8d47_45e6_9153_a8cd92a8758b.slice/crio-edecbf343eb9762c250cc5d010b3077c47840c0e78d17d4f5bc99657ab64f394 WatchSource:0}: Error finding container edecbf343eb9762c250cc5d010b3077c47840c0e78d17d4f5bc99657ab64f394: Status 404 returned error can't find the container with id edecbf343eb9762c250cc5d010b3077c47840c0e78d17d4f5bc99657ab64f394
Feb 02 10:42:18 crc kubenswrapper[4845]: I0202 10:42:18.929121 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" event={"ID":"308cfce2-8d47-45e6-9153-a8cd92a8758b","Type":"ContainerStarted","Data":"edecbf343eb9762c250cc5d010b3077c47840c0e78d17d4f5bc99657ab64f394"}
Feb 02 10:42:24 crc kubenswrapper[4845]: I0202 10:42:24.973662 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" event={"ID":"308cfce2-8d47-45e6-9153-a8cd92a8758b","Type":"ContainerStarted","Data":"05a63d0d9e826be721515bd8c6bac42820418d079e03689b9b9fd6013ab69b6d"}
Feb 02 10:42:24 crc kubenswrapper[4845]: I0202 10:42:24.977449 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" event={"ID":"b75686c5-933f-4f8d-bf87-0229795baf12","Type":"ContainerStarted","Data":"56aa998f079128273f0715901559e831d59a0a7f276b1a2e485b4905bf5366b6"}
Feb 02 10:42:24 crc kubenswrapper[4845]: I0202 10:42:24.977682 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:42:25 crc kubenswrapper[4845]: I0202 10:42:25.000064 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-rmj27" podStartSLOduration=31.93609995 podStartE2EDuration="38.00003884s" podCreationTimestamp="2026-02-02 10:41:47 +0000 UTC" firstStartedPulling="2026-02-02 10:42:18.147583282 +0000 UTC m=+619.238984732" lastFinishedPulling="2026-02-02 10:42:24.211522162 +0000 UTC m=+625.302923622" observedRunningTime="2026-02-02 10:42:24.993065622 +0000 UTC m=+626.084467072" watchObservedRunningTime="2026-02-02 10:42:25.00003884 +0000 UTC m=+626.091440300"
Feb 02 10:42:25 crc kubenswrapper[4845]: I0202 10:42:25.019642 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz" podStartSLOduration=30.028415541 podStartE2EDuration="38.019624218s" podCreationTimestamp="2026-02-02 10:41:47 +0000 UTC" firstStartedPulling="2026-02-02 10:42:16.22399044 +0000 UTC m=+617.315391890" lastFinishedPulling="2026-02-02 10:42:24.215199117 +0000 UTC m=+625.306600567" observedRunningTime="2026-02-02 10:42:25.016099688 +0000 UTC m=+626.107501138" watchObservedRunningTime="2026-02-02 10:42:25.019624218 +0000 UTC m=+626.111025658"
Feb 02 10:42:25 crc kubenswrapper[4845]: I0202 10:42:25.037175 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-5wvdz"
Feb 02 10:42:28 crc kubenswrapper[4845]: I0202 10:42:28.085611 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-8hhqb"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.866674 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7p596"]
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.867798 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7p596"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.870223 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.870548 4845 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qghql"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.871168 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.882580 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ltwq9"]
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.883572 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.886138 4845 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-wt465"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.897432 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8"]
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.898482 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.902352 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7p596"]
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.903681 4845 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-k7zt6"
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.942287 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8"]
Feb 02 10:42:32 crc kubenswrapper[4845]: I0202 10:42:32.953785 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ltwq9"]
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.001559 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z65q6\" (UniqueName: \"kubernetes.io/projected/c1996e72-3bd0-4770-9662-c0c1359d7a8b-kube-api-access-z65q6\") pod \"cert-manager-858654f9db-7p596\" (UID: \"c1996e72-3bd0-4770-9662-c0c1359d7a8b\") " pod="cert-manager/cert-manager-858654f9db-7p596"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.001631 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7tf\" (UniqueName: \"kubernetes.io/projected/7b6c985e-704e-4ff8-b668-d2f4cb218172-kube-api-access-kt7tf\") pod \"cert-manager-webhook-687f57d79b-ltwq9\" (UID: \"7b6c985e-704e-4ff8-b668-d2f4cb218172\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.001690 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55zjj\" (UniqueName: \"kubernetes.io/projected/8b99109d-f1ff-4d24-b08a-c317fffd456c-kube-api-access-55zjj\") pod \"cert-manager-cainjector-cf98fcc89-vqsd8\" (UID: \"8b99109d-f1ff-4d24-b08a-c317fffd456c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.103243 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7tf\" (UniqueName: \"kubernetes.io/projected/7b6c985e-704e-4ff8-b668-d2f4cb218172-kube-api-access-kt7tf\") pod \"cert-manager-webhook-687f57d79b-ltwq9\" (UID: \"7b6c985e-704e-4ff8-b668-d2f4cb218172\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.103327 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55zjj\" (UniqueName: \"kubernetes.io/projected/8b99109d-f1ff-4d24-b08a-c317fffd456c-kube-api-access-55zjj\") pod \"cert-manager-cainjector-cf98fcc89-vqsd8\" (UID: \"8b99109d-f1ff-4d24-b08a-c317fffd456c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.103361 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z65q6\" (UniqueName: \"kubernetes.io/projected/c1996e72-3bd0-4770-9662-c0c1359d7a8b-kube-api-access-z65q6\") pod \"cert-manager-858654f9db-7p596\" (UID: \"c1996e72-3bd0-4770-9662-c0c1359d7a8b\") " pod="cert-manager/cert-manager-858654f9db-7p596"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.130796 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7tf\" (UniqueName: \"kubernetes.io/projected/7b6c985e-704e-4ff8-b668-d2f4cb218172-kube-api-access-kt7tf\") pod \"cert-manager-webhook-687f57d79b-ltwq9\" (UID: \"7b6c985e-704e-4ff8-b668-d2f4cb218172\") " pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.131477 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55zjj\" (UniqueName: \"kubernetes.io/projected/8b99109d-f1ff-4d24-b08a-c317fffd456c-kube-api-access-55zjj\") pod \"cert-manager-cainjector-cf98fcc89-vqsd8\" (UID: \"8b99109d-f1ff-4d24-b08a-c317fffd456c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.134542 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z65q6\" (UniqueName: \"kubernetes.io/projected/c1996e72-3bd0-4770-9662-c0c1359d7a8b-kube-api-access-z65q6\") pod \"cert-manager-858654f9db-7p596\" (UID: \"c1996e72-3bd0-4770-9662-c0c1359d7a8b\") " pod="cert-manager/cert-manager-858654f9db-7p596"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.190929 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7p596"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.220176 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.245819 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8"
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.735595 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-ltwq9"]
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.824395 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8"]
Feb 02 10:42:33 crc kubenswrapper[4845]: W0202 10:42:33.824839 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1996e72_3bd0_4770_9662_c0c1359d7a8b.slice/crio-18c1cc05b7f4a2760b38006e10a59aab789c2cea129f9375b62fcdd3fdb7c587 WatchSource:0}: Error finding container 18c1cc05b7f4a2760b38006e10a59aab789c2cea129f9375b62fcdd3fdb7c587: Status 404 returned error can't find the container with id 18c1cc05b7f4a2760b38006e10a59aab789c2cea129f9375b62fcdd3fdb7c587
Feb 02 10:42:33 crc kubenswrapper[4845]: W0202 10:42:33.828480 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b99109d_f1ff_4d24_b08a_c317fffd456c.slice/crio-aae72d15b12e31cfe6bb5cd99710a1685183bf60aaeffbf924ff4755b050d2bf WatchSource:0}: Error finding container aae72d15b12e31cfe6bb5cd99710a1685183bf60aaeffbf924ff4755b050d2bf: Status 404 returned error can't find the container with id aae72d15b12e31cfe6bb5cd99710a1685183bf60aaeffbf924ff4755b050d2bf
Feb 02 10:42:33 crc kubenswrapper[4845]: I0202 10:42:33.835999 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7p596"]
Feb 02 10:42:34 crc kubenswrapper[4845]: I0202 10:42:34.039819 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8" event={"ID":"8b99109d-f1ff-4d24-b08a-c317fffd456c","Type":"ContainerStarted","Data":"aae72d15b12e31cfe6bb5cd99710a1685183bf60aaeffbf924ff4755b050d2bf"}
Feb 02 10:42:34 crc kubenswrapper[4845]: I0202 10:42:34.041109 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9" event={"ID":"7b6c985e-704e-4ff8-b668-d2f4cb218172","Type":"ContainerStarted","Data":"d907f45c657f2727d993361770ae30682ba0ee0cff6f4f92419fcd1327262b86"}
Feb 02 10:42:34 crc kubenswrapper[4845]: I0202 10:42:34.042209 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7p596" event={"ID":"c1996e72-3bd0-4770-9662-c0c1359d7a8b","Type":"ContainerStarted","Data":"18c1cc05b7f4a2760b38006e10a59aab789c2cea129f9375b62fcdd3fdb7c587"}
Feb 02 10:42:44 crc kubenswrapper[4845]: I0202 10:42:44.116914 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9" event={"ID":"7b6c985e-704e-4ff8-b668-d2f4cb218172","Type":"ContainerStarted","Data":"1b2805dce98489f2e025deaffc94d28553151f450e828ce2cc5c7b43582f7319"}
Feb 02 10:42:44 crc kubenswrapper[4845]: I0202 10:42:44.117420 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9"
Feb 02 10:42:44 crc kubenswrapper[4845]: I0202 10:42:44.118262 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7p596" event={"ID":"c1996e72-3bd0-4770-9662-c0c1359d7a8b","Type":"ContainerStarted","Data":"81cdc807afca4e336148ed9ac4f8d23235229c93406e33df603e4ccd65ae95b0"}
Feb 02 10:42:44 crc kubenswrapper[4845]: I0202 10:42:44.119449 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8" event={"ID":"8b99109d-f1ff-4d24-b08a-c317fffd456c","Type":"ContainerStarted","Data":"389ec7a45d6ff07d80dc73e5713795b2d4188dd6b953053b78c77e0fc161b22f"}
Feb 02 10:42:44 crc kubenswrapper[4845]: I0202 10:42:44.152736 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9" podStartSLOduration=2.035672149 podStartE2EDuration="12.152692014s" podCreationTimestamp="2026-02-02 10:42:32 +0000 UTC" firstStartedPulling="2026-02-02 10:42:33.752620141 +0000 UTC m=+634.844021591" lastFinishedPulling="2026-02-02 10:42:43.869640006 +0000 UTC m=+644.961041456" observedRunningTime="2026-02-02 10:42:44.143266586 +0000 UTC m=+645.234668046" watchObservedRunningTime="2026-02-02 10:42:44.152692014 +0000 UTC m=+645.244093464"
Feb 02 10:42:44 crc kubenswrapper[4845]: I0202 10:42:44.175668 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7p596" podStartSLOduration=2.193571675 podStartE2EDuration="12.175650058s" podCreationTimestamp="2026-02-02 10:42:32 +0000 UTC" firstStartedPulling="2026-02-02 10:42:33.827000779 +0000 UTC m=+634.918402229" lastFinishedPulling="2026-02-02 10:42:43.809079162 +0000 UTC m=+644.900480612" observedRunningTime="2026-02-02 10:42:44.167792334 +0000 UTC m=+645.259193784" watchObservedRunningTime="2026-02-02 10:42:44.175650058 +0000 UTC m=+645.267051508"
Feb 02 10:42:53 crc kubenswrapper[4845]: I0202 10:42:53.224484 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-ltwq9"
Feb 02 10:42:53 crc kubenswrapper[4845]: I0202 10:42:53.247244 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vqsd8" podStartSLOduration=11.270911819 podStartE2EDuration="21.247214548s" podCreationTimestamp="2026-02-02 10:42:32 +0000 UTC" firstStartedPulling="2026-02-02 10:42:33.832775143 +0000 UTC m=+634.924176593" lastFinishedPulling="2026-02-02 10:42:43.809077882 +0000 UTC m=+644.900479322" observedRunningTime="2026-02-02 10:42:44.195943035 +0000 UTC m=+645.287344495"
watchObservedRunningTime="2026-02-02 10:42:53.247214548 +0000 UTC m=+654.338616038" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.605207 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt"] Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.607619 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.609701 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.655970 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt"] Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.760615 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.760687 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.760802 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnhjd\" (UniqueName: \"kubernetes.io/projected/8ead5170-aa2d-4a22-a528-02edf1375239-kube-api-access-fnhjd\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.862462 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.862576 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.862617 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnhjd\" (UniqueName: \"kubernetes.io/projected/8ead5170-aa2d-4a22-a528-02edf1375239-kube-api-access-fnhjd\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.863557 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-bundle\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.863574 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-util\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.885308 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnhjd\" (UniqueName: \"kubernetes.io/projected/8ead5170-aa2d-4a22-a528-02edf1375239-kube-api-access-fnhjd\") pod \"40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:14 crc kubenswrapper[4845]: I0202 10:43:14.931178 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.050333 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz"] Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.051776 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.067120 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz"] Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.171604 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.171710 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9n85\" (UniqueName: \"kubernetes.io/projected/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-kube-api-access-h9n85\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.171743 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.272946 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.273052 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9n85\" (UniqueName: \"kubernetes.io/projected/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-kube-api-access-h9n85\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.273078 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.273758 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-bundle\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.274165 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-util\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: 
\"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.289649 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9n85\" (UniqueName: \"kubernetes.io/projected/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-kube-api-access-h9n85\") pod \"19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.369415 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.448514 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt"] Feb 02 10:43:15 crc kubenswrapper[4845]: W0202 10:43:15.460821 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ead5170_aa2d_4a22_a528_02edf1375239.slice/crio-56692ba5dca356c3a1fa0771540cf593326a8fa372bed74e1f1a7c3ae3139db2 WatchSource:0}: Error finding container 56692ba5dca356c3a1fa0771540cf593326a8fa372bed74e1f1a7c3ae3139db2: Status 404 returned error can't find the container with id 56692ba5dca356c3a1fa0771540cf593326a8fa372bed74e1f1a7c3ae3139db2 Feb 02 10:43:15 crc kubenswrapper[4845]: I0202 10:43:15.768594 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz"] Feb 02 10:43:16 crc kubenswrapper[4845]: I0202 10:43:16.318873 4845 generic.go:334] "Generic (PLEG): container finished" podID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" 
containerID="08e15f1408179ded73501a7180ee37bd0122da7d1e775c76239ad3d338df7392" exitCode=0 Feb 02 10:43:16 crc kubenswrapper[4845]: I0202 10:43:16.319052 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" event={"ID":"2e68fd72-b961-4a58-9f54-01bd2f6ebd76","Type":"ContainerDied","Data":"08e15f1408179ded73501a7180ee37bd0122da7d1e775c76239ad3d338df7392"} Feb 02 10:43:16 crc kubenswrapper[4845]: I0202 10:43:16.319129 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" event={"ID":"2e68fd72-b961-4a58-9f54-01bd2f6ebd76","Type":"ContainerStarted","Data":"c328959b5704423d66f46d82cd926b32f32ec5337fd4df119b8682189b823366"} Feb 02 10:43:16 crc kubenswrapper[4845]: I0202 10:43:16.327391 4845 generic.go:334] "Generic (PLEG): container finished" podID="8ead5170-aa2d-4a22-a528-02edf1375239" containerID="cf8d0bbf8e7ea2bdf2bafb3c41509dc4a4527f3b417b1f3f58101835dbfaab0f" exitCode=0 Feb 02 10:43:16 crc kubenswrapper[4845]: I0202 10:43:16.327440 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" event={"ID":"8ead5170-aa2d-4a22-a528-02edf1375239","Type":"ContainerDied","Data":"cf8d0bbf8e7ea2bdf2bafb3c41509dc4a4527f3b417b1f3f58101835dbfaab0f"} Feb 02 10:43:16 crc kubenswrapper[4845]: I0202 10:43:16.327471 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" event={"ID":"8ead5170-aa2d-4a22-a528-02edf1375239","Type":"ContainerStarted","Data":"56692ba5dca356c3a1fa0771540cf593326a8fa372bed74e1f1a7c3ae3139db2"} Feb 02 10:43:18 crc kubenswrapper[4845]: I0202 10:43:18.340322 4845 generic.go:334] "Generic (PLEG): container finished" podID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" 
containerID="8c67dbf7cfb27dcf9fd8534c99368255fab933e7e5f365f89afec67173dcc887" exitCode=0 Feb 02 10:43:18 crc kubenswrapper[4845]: I0202 10:43:18.340427 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" event={"ID":"2e68fd72-b961-4a58-9f54-01bd2f6ebd76","Type":"ContainerDied","Data":"8c67dbf7cfb27dcf9fd8534c99368255fab933e7e5f365f89afec67173dcc887"} Feb 02 10:43:18 crc kubenswrapper[4845]: I0202 10:43:18.343429 4845 generic.go:334] "Generic (PLEG): container finished" podID="8ead5170-aa2d-4a22-a528-02edf1375239" containerID="4f2619db0a67dd6fd30bb0e621c70150792e47b4de68ff5a9637eff88e5594ef" exitCode=0 Feb 02 10:43:18 crc kubenswrapper[4845]: I0202 10:43:18.343468 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" event={"ID":"8ead5170-aa2d-4a22-a528-02edf1375239","Type":"ContainerDied","Data":"4f2619db0a67dd6fd30bb0e621c70150792e47b4de68ff5a9637eff88e5594ef"} Feb 02 10:43:19 crc kubenswrapper[4845]: I0202 10:43:19.356777 4845 generic.go:334] "Generic (PLEG): container finished" podID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" containerID="17ba9db62c398a40cf3c0fe7cf89e3c388ae00f4c5d6cd57a195a580a2af57ce" exitCode=0 Feb 02 10:43:19 crc kubenswrapper[4845]: I0202 10:43:19.356840 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" event={"ID":"2e68fd72-b961-4a58-9f54-01bd2f6ebd76","Type":"ContainerDied","Data":"17ba9db62c398a40cf3c0fe7cf89e3c388ae00f4c5d6cd57a195a580a2af57ce"} Feb 02 10:43:19 crc kubenswrapper[4845]: I0202 10:43:19.359254 4845 generic.go:334] "Generic (PLEG): container finished" podID="8ead5170-aa2d-4a22-a528-02edf1375239" containerID="0023190a8d25d3f86dc87050e5358fb264a399bdddaa2a897010bf724797d898" exitCode=0 Feb 02 10:43:19 crc kubenswrapper[4845]: I0202 10:43:19.359327 
4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" event={"ID":"8ead5170-aa2d-4a22-a528-02edf1375239","Type":"ContainerDied","Data":"0023190a8d25d3f86dc87050e5358fb264a399bdddaa2a897010bf724797d898"} Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.660389 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.667993 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.771939 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-bundle\") pod \"8ead5170-aa2d-4a22-a528-02edf1375239\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.771991 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9n85\" (UniqueName: \"kubernetes.io/projected/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-kube-api-access-h9n85\") pod \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.772024 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-bundle\") pod \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.772074 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-util\") pod \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\" (UID: \"2e68fd72-b961-4a58-9f54-01bd2f6ebd76\") " Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.772119 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnhjd\" (UniqueName: \"kubernetes.io/projected/8ead5170-aa2d-4a22-a528-02edf1375239-kube-api-access-fnhjd\") pod \"8ead5170-aa2d-4a22-a528-02edf1375239\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.772156 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-util\") pod \"8ead5170-aa2d-4a22-a528-02edf1375239\" (UID: \"8ead5170-aa2d-4a22-a528-02edf1375239\") " Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.773172 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-bundle" (OuterVolumeSpecName: "bundle") pod "8ead5170-aa2d-4a22-a528-02edf1375239" (UID: "8ead5170-aa2d-4a22-a528-02edf1375239"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.773170 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-bundle" (OuterVolumeSpecName: "bundle") pod "2e68fd72-b961-4a58-9f54-01bd2f6ebd76" (UID: "2e68fd72-b961-4a58-9f54-01bd2f6ebd76"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.778125 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ead5170-aa2d-4a22-a528-02edf1375239-kube-api-access-fnhjd" (OuterVolumeSpecName: "kube-api-access-fnhjd") pod "8ead5170-aa2d-4a22-a528-02edf1375239" (UID: "8ead5170-aa2d-4a22-a528-02edf1375239"). InnerVolumeSpecName "kube-api-access-fnhjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.786169 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-util" (OuterVolumeSpecName: "util") pod "2e68fd72-b961-4a58-9f54-01bd2f6ebd76" (UID: "2e68fd72-b961-4a58-9f54-01bd2f6ebd76"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.786190 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-kube-api-access-h9n85" (OuterVolumeSpecName: "kube-api-access-h9n85") pod "2e68fd72-b961-4a58-9f54-01bd2f6ebd76" (UID: "2e68fd72-b961-4a58-9f54-01bd2f6ebd76"). InnerVolumeSpecName "kube-api-access-h9n85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.787720 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-util" (OuterVolumeSpecName: "util") pod "8ead5170-aa2d-4a22-a528-02edf1375239" (UID: "8ead5170-aa2d-4a22-a528-02edf1375239"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.874755 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9n85\" (UniqueName: \"kubernetes.io/projected/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-kube-api-access-h9n85\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.874827 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.874859 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e68fd72-b961-4a58-9f54-01bd2f6ebd76-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.874973 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnhjd\" (UniqueName: \"kubernetes.io/projected/8ead5170-aa2d-4a22-a528-02edf1375239-kube-api-access-fnhjd\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.874997 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:20 crc kubenswrapper[4845]: I0202 10:43:20.875018 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ead5170-aa2d-4a22-a528-02edf1375239-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:43:21 crc kubenswrapper[4845]: I0202 10:43:21.378234 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" event={"ID":"2e68fd72-b961-4a58-9f54-01bd2f6ebd76","Type":"ContainerDied","Data":"c328959b5704423d66f46d82cd926b32f32ec5337fd4df119b8682189b823366"} Feb 02 10:43:21 crc 
kubenswrapper[4845]: I0202 10:43:21.378266 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz" Feb 02 10:43:21 crc kubenswrapper[4845]: I0202 10:43:21.378284 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c328959b5704423d66f46d82cd926b32f32ec5337fd4df119b8682189b823366" Feb 02 10:43:21 crc kubenswrapper[4845]: I0202 10:43:21.380406 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" event={"ID":"8ead5170-aa2d-4a22-a528-02edf1375239","Type":"ContainerDied","Data":"56692ba5dca356c3a1fa0771540cf593326a8fa372bed74e1f1a7c3ae3139db2"} Feb 02 10:43:21 crc kubenswrapper[4845]: I0202 10:43:21.380446 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56692ba5dca356c3a1fa0771540cf593326a8fa372bed74e1f1a7c3ae3139db2" Feb 02 10:43:21 crc kubenswrapper[4845]: I0202 10:43:21.380488 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.459839 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b"] Feb 02 10:43:31 crc kubenswrapper[4845]: E0202 10:43:31.460632 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ead5170-aa2d-4a22-a528-02edf1375239" containerName="util" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.460644 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ead5170-aa2d-4a22-a528-02edf1375239" containerName="util" Feb 02 10:43:31 crc kubenswrapper[4845]: E0202 10:43:31.460654 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" containerName="extract" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.460660 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" containerName="extract" Feb 02 10:43:31 crc kubenswrapper[4845]: E0202 10:43:31.460669 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ead5170-aa2d-4a22-a528-02edf1375239" containerName="extract" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.460674 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ead5170-aa2d-4a22-a528-02edf1375239" containerName="extract" Feb 02 10:43:31 crc kubenswrapper[4845]: E0202 10:43:31.460686 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" containerName="pull" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.460713 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" containerName="pull" Feb 02 10:43:31 crc kubenswrapper[4845]: E0202 10:43:31.460725 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ead5170-aa2d-4a22-a528-02edf1375239" 
containerName="pull" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.460731 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ead5170-aa2d-4a22-a528-02edf1375239" containerName="pull" Feb 02 10:43:31 crc kubenswrapper[4845]: E0202 10:43:31.460744 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" containerName="util" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.460749 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" containerName="util" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.460849 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ead5170-aa2d-4a22-a528-02edf1375239" containerName="extract" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.460865 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e68fd72-b961-4a58-9f54-01bd2f6ebd76" containerName="extract" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.461600 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.466535 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-5kb9h" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.467326 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.467546 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.467547 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.467948 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.469008 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.487097 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b"] Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.523405 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-manager-config\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.523489 
4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-apiservice-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.523517 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.523535 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-webhook-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.523553 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbxnt\" (UniqueName: \"kubernetes.io/projected/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-kube-api-access-bbxnt\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.624462 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-manager-config\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.624553 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-apiservice-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.624584 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.624607 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-webhook-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.624634 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbxnt\" (UniqueName: \"kubernetes.io/projected/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-kube-api-access-bbxnt\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.625922 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-manager-config\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.630545 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.633653 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-apiservice-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.634599 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-webhook-cert\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.652529 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bbxnt\" (UniqueName: \"kubernetes.io/projected/2988a5fa-2703-4a60-bcd6-dc81ceea7e1a-kube-api-access-bbxnt\") pod \"loki-operator-controller-manager-b659b8cd7-mwl8b\" (UID: \"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a\") " pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:31 crc kubenswrapper[4845]: I0202 10:43:31.778400 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:32 crc kubenswrapper[4845]: I0202 10:43:32.227542 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b"] Feb 02 10:43:32 crc kubenswrapper[4845]: W0202 10:43:32.230090 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2988a5fa_2703_4a60_bcd6_dc81ceea7e1a.slice/crio-5bfd8190657502ce3817fad07a8e68172ebc266a00f608242c592ff4dd86777b WatchSource:0}: Error finding container 5bfd8190657502ce3817fad07a8e68172ebc266a00f608242c592ff4dd86777b: Status 404 returned error can't find the container with id 5bfd8190657502ce3817fad07a8e68172ebc266a00f608242c592ff4dd86777b Feb 02 10:43:32 crc kubenswrapper[4845]: I0202 10:43:32.444595 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" event={"ID":"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a","Type":"ContainerStarted","Data":"5bfd8190657502ce3817fad07a8e68172ebc266a00f608242c592ff4dd86777b"} Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.048990 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6"] Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.050871 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6" Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.053455 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.053814 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-2jhc9" Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.054401 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.071356 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6"] Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.198038 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5dd\" (UniqueName: \"kubernetes.io/projected/cb944758-f09b-4486-9f3b-4ef87b53246b-kube-api-access-jj5dd\") pod \"cluster-logging-operator-79cf69ddc8-4pzr6\" (UID: \"cb944758-f09b-4486-9f3b-4ef87b53246b\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6" Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.299986 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj5dd\" (UniqueName: \"kubernetes.io/projected/cb944758-f09b-4486-9f3b-4ef87b53246b-kube-api-access-jj5dd\") pod \"cluster-logging-operator-79cf69ddc8-4pzr6\" (UID: \"cb944758-f09b-4486-9f3b-4ef87b53246b\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6" Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.330143 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5dd\" (UniqueName: \"kubernetes.io/projected/cb944758-f09b-4486-9f3b-4ef87b53246b-kube-api-access-jj5dd\") pod 
\"cluster-logging-operator-79cf69ddc8-4pzr6\" (UID: \"cb944758-f09b-4486-9f3b-4ef87b53246b\") " pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6" Feb 02 10:43:36 crc kubenswrapper[4845]: I0202 10:43:36.375975 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6" Feb 02 10:43:37 crc kubenswrapper[4845]: I0202 10:43:37.373911 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6"] Feb 02 10:43:38 crc kubenswrapper[4845]: I0202 10:43:38.483337 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" event={"ID":"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a","Type":"ContainerStarted","Data":"ed3d3cae21e8dac792a54567d397630f3cd37f4717a98acd18cf5208104346c8"} Feb 02 10:43:38 crc kubenswrapper[4845]: I0202 10:43:38.484557 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6" event={"ID":"cb944758-f09b-4486-9f3b-4ef87b53246b","Type":"ContainerStarted","Data":"4aea2474f36641544824e3dac558a997622c039c094b9272ee1c89b62342d8e9"} Feb 02 10:43:46 crc kubenswrapper[4845]: I0202 10:43:46.237756 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:43:46 crc kubenswrapper[4845]: I0202 10:43:46.238349 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 
10:43:46 crc kubenswrapper[4845]: I0202 10:43:46.549597 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" event={"ID":"2988a5fa-2703-4a60-bcd6-dc81ceea7e1a","Type":"ContainerStarted","Data":"55f26179f3e73506c8839663cd83b18c32b1d5b7dbc19547af5f55398856c41b"} Feb 02 10:43:46 crc kubenswrapper[4845]: I0202 10:43:46.549957 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:46 crc kubenswrapper[4845]: I0202 10:43:46.551830 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" Feb 02 10:43:46 crc kubenswrapper[4845]: I0202 10:43:46.552741 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6" event={"ID":"cb944758-f09b-4486-9f3b-4ef87b53246b","Type":"ContainerStarted","Data":"1eec1502739bdc2565db67833ef15c1010efda6659286317c92d7e6c512ff7fd"} Feb 02 10:43:46 crc kubenswrapper[4845]: I0202 10:43:46.581093 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-b659b8cd7-mwl8b" podStartSLOduration=1.648783991 podStartE2EDuration="15.58107582s" podCreationTimestamp="2026-02-02 10:43:31 +0000 UTC" firstStartedPulling="2026-02-02 10:43:32.232984125 +0000 UTC m=+693.324385575" lastFinishedPulling="2026-02-02 10:43:46.165275954 +0000 UTC m=+707.256677404" observedRunningTime="2026-02-02 10:43:46.572660056 +0000 UTC m=+707.664061506" watchObservedRunningTime="2026-02-02 10:43:46.58107582 +0000 UTC m=+707.672477260" Feb 02 10:43:46 crc kubenswrapper[4845]: I0202 10:43:46.619421 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-79cf69ddc8-4pzr6" 
podStartSLOduration=2.104720488 podStartE2EDuration="10.619405161s" podCreationTimestamp="2026-02-02 10:43:36 +0000 UTC" firstStartedPulling="2026-02-02 10:43:37.633344271 +0000 UTC m=+698.724745721" lastFinishedPulling="2026-02-02 10:43:46.148028944 +0000 UTC m=+707.239430394" observedRunningTime="2026-02-02 10:43:46.615446416 +0000 UTC m=+707.706847866" watchObservedRunningTime="2026-02-02 10:43:46.619405161 +0000 UTC m=+707.710806611" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.657029 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.662126 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.664813 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.664961 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.664993 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.750023 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-66f043a3-08d0-477d-b973-942443398b84\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66f043a3-08d0-477d-b973-942443398b84\") pod \"minio\" (UID: \"09851333-f877-4094-8451-908fa1abc4a9\") " pod="minio-dev/minio" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.750193 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6kjf\" (UniqueName: \"kubernetes.io/projected/09851333-f877-4094-8451-908fa1abc4a9-kube-api-access-m6kjf\") pod \"minio\" (UID: \"09851333-f877-4094-8451-908fa1abc4a9\") " pod="minio-dev/minio" 
Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.851732 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6kjf\" (UniqueName: \"kubernetes.io/projected/09851333-f877-4094-8451-908fa1abc4a9-kube-api-access-m6kjf\") pod \"minio\" (UID: \"09851333-f877-4094-8451-908fa1abc4a9\") " pod="minio-dev/minio" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.851810 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-66f043a3-08d0-477d-b973-942443398b84\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66f043a3-08d0-477d-b973-942443398b84\") pod \"minio\" (UID: \"09851333-f877-4094-8451-908fa1abc4a9\") " pod="minio-dev/minio" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.858976 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.859037 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-66f043a3-08d0-477d-b973-942443398b84\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66f043a3-08d0-477d-b973-942443398b84\") pod \"minio\" (UID: \"09851333-f877-4094-8451-908fa1abc4a9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4b1d0a84b00582161558387a47d62aa534a513d4618a73aac4f7aa998f727f17/globalmount\"" pod="minio-dev/minio" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.871847 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6kjf\" (UniqueName: \"kubernetes.io/projected/09851333-f877-4094-8451-908fa1abc4a9-kube-api-access-m6kjf\") pod \"minio\" (UID: \"09851333-f877-4094-8451-908fa1abc4a9\") " pod="minio-dev/minio" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.889840 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-66f043a3-08d0-477d-b973-942443398b84\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-66f043a3-08d0-477d-b973-942443398b84\") pod \"minio\" (UID: \"09851333-f877-4094-8451-908fa1abc4a9\") " pod="minio-dev/minio" Feb 02 10:43:51 crc kubenswrapper[4845]: I0202 10:43:51.992488 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 02 10:43:52 crc kubenswrapper[4845]: I0202 10:43:52.434158 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 02 10:43:52 crc kubenswrapper[4845]: I0202 10:43:52.593350 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"09851333-f877-4094-8451-908fa1abc4a9","Type":"ContainerStarted","Data":"279fa9017fa643cb20d1bc76003809b3a8bf0d8f551ba032000407ac89e11ce2"} Feb 02 10:43:57 crc kubenswrapper[4845]: I0202 10:43:57.625264 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"09851333-f877-4094-8451-908fa1abc4a9","Type":"ContainerStarted","Data":"870a27a0492f8d5f263f5d7470212eaf1370d2d4884793c941c7334268621208"} Feb 02 10:43:57 crc kubenswrapper[4845]: I0202 10:43:57.647764 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.499983192 podStartE2EDuration="8.647744099s" podCreationTimestamp="2026-02-02 10:43:49 +0000 UTC" firstStartedPulling="2026-02-02 10:43:52.442283368 +0000 UTC m=+713.533684818" lastFinishedPulling="2026-02-02 10:43:56.590044265 +0000 UTC m=+717.681445725" observedRunningTime="2026-02-02 10:43:57.642809357 +0000 UTC m=+718.734210797" watchObservedRunningTime="2026-02-02 10:43:57.647744099 +0000 UTC m=+718.739145559" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.332822 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-847z7"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.335616 4845 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.341560 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.341874 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-nt9sz" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.341944 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.341672 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.342217 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.352219 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-847z7"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.420652 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.420733 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnnv4\" (UniqueName: \"kubernetes.io/projected/4af06166-f541-44e7-8b4b-37e4f39a8729-kube-api-access-vnnv4\") pod 
\"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.420759 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af06166-f541-44e7-8b4b-37e4f39a8729-config\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.420791 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.420827 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.440654 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-sbp94"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.441586 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.444941 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.445114 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.445265 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.467056 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-sbp94"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528184 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl4zb\" (UniqueName: \"kubernetes.io/projected/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-kube-api-access-bl4zb\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528238 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnnv4\" (UniqueName: \"kubernetes.io/projected/4af06166-f541-44e7-8b4b-37e4f39a8729-kube-api-access-vnnv4\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528263 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af06166-f541-44e7-8b4b-37e4f39a8729-config\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: 
\"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528309 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528333 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528358 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528378 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-config\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528420 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528447 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528514 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-s3\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.528534 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.530000 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af06166-f541-44e7-8b4b-37e4f39a8729-config\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " 
pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.530924 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.535041 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.537009 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.541394 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.541757 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.542353 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-distributor-http\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.544746 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/4af06166-f541-44e7-8b4b-37e4f39a8729-logging-loki-distributor-grpc\") pod 
\"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.556328 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.572934 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnnv4\" (UniqueName: \"kubernetes.io/projected/4af06166-f541-44e7-8b4b-37e4f39a8729-kube-api-access-vnnv4\") pod \"logging-loki-distributor-5f678c8dd6-847z7\" (UID: \"4af06166-f541-44e7-8b4b-37e4f39a8729\") " pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.630844 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.630964 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.631047 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-config\") pod 
\"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.631115 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-s3\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.631146 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.632186 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl4zb\" (UniqueName: \"kubernetes.io/projected/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-kube-api-access-bl4zb\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.632243 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.632287 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.632323 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvtks\" (UniqueName: \"kubernetes.io/projected/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-kube-api-access-qvtks\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.632398 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.632449 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-config\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.634073 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-config\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " 
pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.634795 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-ca-bundle\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.650315 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.677609 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-querier-grpc\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.678490 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-s3\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.679266 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-logging-loki-querier-http\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.679836 4845 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.685816 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.693113 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.693321 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.693439 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.694495 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.694539 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.694618 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.697182 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.701045 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl4zb\" (UniqueName: \"kubernetes.io/projected/796c275c-0c9b-4b2e-ba0f-7fbeb645028a-kube-api-access-bl4zb\") pod \"logging-loki-querier-76788598db-sbp94\" (UID: \"796c275c-0c9b-4b2e-ba0f-7fbeb645028a\") " pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.701078 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-48tlt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.712008 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.719052 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt"] Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.733781 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-lokistack-gateway\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.734086 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tls-secret\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.734195 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-rbac\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.734987 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735098 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tenants\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735187 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735286 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqt2q\" (UniqueName: \"kubernetes.io/projected/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-kube-api-access-wqt2q\") pod 
\"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735387 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tls-secret\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735490 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735601 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw5l4\" (UniqueName: \"kubernetes.io/projected/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-kube-api-access-gw5l4\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735698 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735794 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.735909 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tenants\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.736024 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.736129 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.736247 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-config\") pod 
\"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.736339 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.736455 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-rbac\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.736577 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-lokistack-gateway\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.736733 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 
10:44:02.736874 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvtks\" (UniqueName: \"kubernetes.io/projected/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-kube-api-access-qvtks\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.740749 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.741798 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-config\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.742074 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.751356 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-logging-loki-query-frontend-grpc\") pod 
\"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.758860 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvtks\" (UniqueName: \"kubernetes.io/projected/27a684fe-6402-4a0d-ab7c-e5c4eab14a64-kube-api-access-qvtks\") pod \"logging-loki-query-frontend-69d9546745-dz4l8\" (UID: \"27a684fe-6402-4a0d-ab7c-e5c4eab14a64\") " pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.761762 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.839617 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw5l4\" (UniqueName: \"kubernetes.io/projected/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-kube-api-access-gw5l4\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.839664 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.839710 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tenants\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " 
pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.839742 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.839778 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.839827 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-rbac\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.839871 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-lokistack-gateway\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840000 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: 
\"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-lokistack-gateway\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840032 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tls-secret\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840052 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-rbac\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840072 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840117 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tenants\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840139 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840161 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqt2q\" (UniqueName: \"kubernetes.io/projected/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-kube-api-access-wqt2q\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840202 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tls-secret\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.840223 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.841137 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " 
pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.842378 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-lokistack-gateway\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: E0202 10:44:02.843090 4845 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Feb 02 10:44:02 crc kubenswrapper[4845]: E0202 10:44:02.843147 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tls-secret podName:2b18d0a9-d2cc-4d0b-9ede-a78da13ac929 nodeName:}" failed. No retries permitted until 2026-02-02 10:44:03.343129245 +0000 UTC m=+724.434530765 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tls-secret") pod "logging-loki-gateway-cf45dcc8c-vr5gw" (UID: "2b18d0a9-d2cc-4d0b-9ede-a78da13ac929") : secret "logging-loki-gateway-http" not found Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.843369 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.844878 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-lokistack-gateway\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: E0202 10:44:02.844989 4845 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Feb 02 10:44:02 crc kubenswrapper[4845]: E0202 10:44:02.845053 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tls-secret podName:1a4ec7d2-3bae-4f70-9a46-e90b067a0518 nodeName:}" failed. No retries permitted until 2026-02-02 10:44:03.3450201 +0000 UTC m=+724.436421550 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tls-secret") pod "logging-loki-gateway-cf45dcc8c-wn9nt" (UID: "1a4ec7d2-3bae-4f70-9a46-e90b067a0518") : secret "logging-loki-gateway-http" not found Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.845346 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-rbac\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.845369 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tenants\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.846295 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.846465 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-rbac\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.847615 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"
Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.849771 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tenants\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"
Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.850782 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"
Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.864547 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt"
Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.866841 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqt2q\" (UniqueName: \"kubernetes.io/projected/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-kube-api-access-wqt2q\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"
Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.869747 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw5l4\" (UniqueName: \"kubernetes.io/projected/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-kube-api-access-gw5l4\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt"
Feb 02 10:44:02 crc kubenswrapper[4845]: I0202 10:44:02.983608 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.204074 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76788598db-sbp94"]
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.274483 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5f678c8dd6-847z7"]
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.378119 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tls-secret\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.378182 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tls-secret\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.384646 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/1a4ec7d2-3bae-4f70-9a46-e90b067a0518-tls-secret\") pod \"logging-loki-gateway-cf45dcc8c-wn9nt\" (UID: \"1a4ec7d2-3bae-4f70-9a46-e90b067a0518\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.386142 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/2b18d0a9-d2cc-4d0b-9ede-a78da13ac929-tls-secret\") pod \"logging-loki-gateway-cf45dcc8c-vr5gw\" (UID: \"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929\") " pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.442112 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.444694 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.448240 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.448374 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.467300 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.484192 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8"]
Feb 02 10:44:03 crc kubenswrapper[4845]: W0202 10:44:03.487105 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27a684fe_6402_4a0d_ab7c_e5c4eab14a64.slice/crio-9de01e1076e7dcb39b360a42f3a67c269f397ebd39929f72b5292e9adab0394c WatchSource:0}: Error finding container 9de01e1076e7dcb39b360a42f3a67c269f397ebd39929f72b5292e9adab0394c: Status 404 returned error can't find the container with id 9de01e1076e7dcb39b360a42f3a67c269f397ebd39929f72b5292e9adab0394c
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.514625 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.515782 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.517588 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.520865 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.530184 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.580779 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-365e64a3-5a62-4e52-986f-ab8e64490fb9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-365e64a3-5a62-4e52-986f-ab8e64490fb9\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.580845 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.580875 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.580923 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.580950 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.580968 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2753f72d-7aa0-4324-baa7-579d7b46eb15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2753f72d-7aa0-4324-baa7-579d7b46eb15\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.580996 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f889290-f739-444c-a278-254f68d9d886-config\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.581017 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-722bd17b-12e2-4e74-a109-fc7e8a9dd2f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-722bd17b-12e2-4e74-a109-fc7e8a9dd2f6\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.581036 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.581093 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.581131 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.581159 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5q9\" (UniqueName: \"kubernetes.io/projected/2d889e99-8118-4f52-ab20-b69a55bec079-kube-api-access-6x5q9\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.581179 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f889290-f739-444c-a278-254f68d9d886-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.581200 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwqm\" (UniqueName: \"kubernetes.io/projected/1f889290-f739-444c-a278-254f68d9d886-kube-api-access-jlwqm\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.581219 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d889e99-8118-4f52-ab20-b69a55bec079-config\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.584996 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.585965 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.588046 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.589105 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.599811 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.606983 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.640879 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682368 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8dt\" (UniqueName: \"kubernetes.io/projected/10b5b71f-47de-4ca2-9133-254552173c73-kube-api-access-nh8dt\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682428 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682466 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682488 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682506 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682531 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682551 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5q9\" (UniqueName: \"kubernetes.io/projected/2d889e99-8118-4f52-ab20-b69a55bec079-kube-api-access-6x5q9\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682569 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f889290-f739-444c-a278-254f68d9d886-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682595 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8e49ca23-3578-47d2-a1ae-ee6ec97e5c08\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e49ca23-3578-47d2-a1ae-ee6ec97e5c08\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682614 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlwqm\" (UniqueName: \"kubernetes.io/projected/1f889290-f739-444c-a278-254f68d9d886-kube-api-access-jlwqm\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682634 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682659 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682676 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682696 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682712 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682728 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2753f72d-7aa0-4324-baa7-579d7b46eb15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2753f72d-7aa0-4324-baa7-579d7b46eb15\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682751 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f889290-f739-444c-a278-254f68d9d886-config\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682771 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-722bd17b-12e2-4e74-a109-fc7e8a9dd2f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-722bd17b-12e2-4e74-a109-fc7e8a9dd2f6\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682796 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b5b71f-47de-4ca2-9133-254552173c73-config\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682829 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d889e99-8118-4f52-ab20-b69a55bec079-config\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682851 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-365e64a3-5a62-4e52-986f-ab8e64490fb9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-365e64a3-5a62-4e52-986f-ab8e64490fb9\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.682868 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.683613 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.684955 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f889290-f739-444c-a278-254f68d9d886-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.688140 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.689697 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f889290-f739-444c-a278-254f68d9d886-config\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.691371 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.692418 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d889e99-8118-4f52-ab20-b69a55bec079-config\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.693436 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.695016 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.695040 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-722bd17b-12e2-4e74-a109-fc7e8a9dd2f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-722bd17b-12e2-4e74-a109-fc7e8a9dd2f6\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a953406181df73c1ca047b8acbd804f2e868561bc8ffa02202afc7ae6c7ed2a8/globalmount\"" pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.698186 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.698208 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2753f72d-7aa0-4324-baa7-579d7b46eb15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2753f72d-7aa0-4324-baa7-579d7b46eb15\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/31a7571ca6c4152e78090cb6fe4ef838eea0184dcab8df284fca036afc1747d3/globalmount\"" pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.699635 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.711860 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5q9\" (UniqueName: \"kubernetes.io/projected/2d889e99-8118-4f52-ab20-b69a55bec079-kube-api-access-6x5q9\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.713369 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/1f889290-f739-444c-a278-254f68d9d886-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.714170 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.714199 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-365e64a3-5a62-4e52-986f-ab8e64490fb9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-365e64a3-5a62-4e52-986f-ab8e64490fb9\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b46ddeb29837386912cb1ef2e6544d28525a72d4688d1a2a19b2c6658c304d02/globalmount\"" pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.714287 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2d889e99-8118-4f52-ab20-b69a55bec079-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.722502 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwqm\" (UniqueName: \"kubernetes.io/projected/1f889290-f739-444c-a278-254f68d9d886-kube-api-access-jlwqm\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.725983 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" event={"ID":"27a684fe-6402-4a0d-ab7c-e5c4eab14a64","Type":"ContainerStarted","Data":"9de01e1076e7dcb39b360a42f3a67c269f397ebd39929f72b5292e9adab0394c"}
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.726025 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-sbp94" event={"ID":"796c275c-0c9b-4b2e-ba0f-7fbeb645028a","Type":"ContainerStarted","Data":"cbc4209c03b20422f2d8c6ab4573e6bf63c5b82f6778343a4460ff443dba53ba"}
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.727169 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" event={"ID":"4af06166-f541-44e7-8b4b-37e4f39a8729","Type":"ContainerStarted","Data":"7671610f17bfc58e7d18806a1287ca970ef8eecb05b3291f7fd7dce378ce7247"}
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.737095 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-722bd17b-12e2-4e74-a109-fc7e8a9dd2f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-722bd17b-12e2-4e74-a109-fc7e8a9dd2f6\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.748621 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2753f72d-7aa0-4324-baa7-579d7b46eb15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2753f72d-7aa0-4324-baa7-579d7b46eb15\") pod \"logging-loki-ingester-0\" (UID: \"2d889e99-8118-4f52-ab20-b69a55bec079\") " pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.768794 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-365e64a3-5a62-4e52-986f-ab8e64490fb9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-365e64a3-5a62-4e52-986f-ab8e64490fb9\") pod \"logging-loki-compactor-0\" (UID: \"1f889290-f739-444c-a278-254f68d9d886\") " pod="openshift-logging/logging-loki-compactor-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.769453 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.784403 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.784456 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8dt\" (UniqueName: \"kubernetes.io/projected/10b5b71f-47de-4ca2-9133-254552173c73-kube-api-access-nh8dt\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.784483 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.784537 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.784586 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8e49ca23-3578-47d2-a1ae-ee6ec97e5c08\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e49ca23-3578-47d2-a1ae-ee6ec97e5c08\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.784617 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.784678 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b5b71f-47de-4ca2-9133-254552173c73-config\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.786434 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10b5b71f-47de-4ca2-9133-254552173c73-config\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.788007 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.790144 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.790176 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8e49ca23-3578-47d2-a1ae-ee6ec97e5c08\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e49ca23-3578-47d2-a1ae-ee6ec97e5c08\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f4d77672c07f1c6dc1c2107fe162f508c8a1c0da8ffd3b4ec81f8850e0496143/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.790446 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.790666 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.791716 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/10b5b71f-47de-4ca2-9133-254552173c73-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0"
Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.804807 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nh8dt\" (UniqueName: \"kubernetes.io/projected/10b5b71f-47de-4ca2-9133-254552173c73-kube-api-access-nh8dt\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.819722 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8e49ca23-3578-47d2-a1ae-ee6ec97e5c08\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8e49ca23-3578-47d2-a1ae-ee6ec97e5c08\") pod \"logging-loki-index-gateway-0\" (UID: \"10b5b71f-47de-4ca2-9133-254552173c73\") " pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.839100 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Feb 02 10:44:03 crc kubenswrapper[4845]: I0202 10:44:03.907220 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.047635 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw"] Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.118255 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt"] Feb 02 10:44:04 crc kubenswrapper[4845]: W0202 10:44:04.120984 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a4ec7d2_3bae_4f70_9a46_e90b067a0518.slice/crio-4a8edaf48ab7d00ef01502d4b29ee2f1a98c351828e1a83aba3c4328a0965efc WatchSource:0}: Error finding container 4a8edaf48ab7d00ef01502d4b29ee2f1a98c351828e1a83aba3c4328a0965efc: Status 404 returned error can't find the container with id 4a8edaf48ab7d00ef01502d4b29ee2f1a98c351828e1a83aba3c4328a0965efc Feb 02 10:44:04 crc 
kubenswrapper[4845]: I0202 10:44:04.255113 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Feb 02 10:44:04 crc kubenswrapper[4845]: W0202 10:44:04.269948 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d889e99_8118_4f52_ab20_b69a55bec079.slice/crio-8dc65cd44955b04d286338f1906f764c6cc0cf521b0af541aee00492c63ce25f WatchSource:0}: Error finding container 8dc65cd44955b04d286338f1906f764c6cc0cf521b0af541aee00492c63ce25f: Status 404 returned error can't find the container with id 8dc65cd44955b04d286338f1906f764c6cc0cf521b0af541aee00492c63ce25f Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.327194 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Feb 02 10:44:04 crc kubenswrapper[4845]: W0202 10:44:04.328279 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f889290_f739_444c_a278_254f68d9d886.slice/crio-2eb128eb19f75ed3af633bf1d9a28ba239efd309f0949b4fc12d102cd18b623b WatchSource:0}: Error finding container 2eb128eb19f75ed3af633bf1d9a28ba239efd309f0949b4fc12d102cd18b623b: Status 404 returned error can't find the container with id 2eb128eb19f75ed3af633bf1d9a28ba239efd309f0949b4fc12d102cd18b623b Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.388844 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Feb 02 10:44:04 crc kubenswrapper[4845]: W0202 10:44:04.389251 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10b5b71f_47de_4ca2_9133_254552173c73.slice/crio-6621bab08e80f97a2ab7a39da0a2d0718bccc2e28df57c5c270af058d1867031 WatchSource:0}: Error finding container 6621bab08e80f97a2ab7a39da0a2d0718bccc2e28df57c5c270af058d1867031: Status 404 
returned error can't find the container with id 6621bab08e80f97a2ab7a39da0a2d0718bccc2e28df57c5c270af058d1867031 Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.733863 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"1f889290-f739-444c-a278-254f68d9d886","Type":"ContainerStarted","Data":"2eb128eb19f75ed3af633bf1d9a28ba239efd309f0949b4fc12d102cd18b623b"} Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.735242 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"2d889e99-8118-4f52-ab20-b69a55bec079","Type":"ContainerStarted","Data":"8dc65cd44955b04d286338f1906f764c6cc0cf521b0af541aee00492c63ce25f"} Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.736378 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" event={"ID":"1a4ec7d2-3bae-4f70-9a46-e90b067a0518","Type":"ContainerStarted","Data":"4a8edaf48ab7d00ef01502d4b29ee2f1a98c351828e1a83aba3c4328a0965efc"} Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.737571 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"10b5b71f-47de-4ca2-9133-254552173c73","Type":"ContainerStarted","Data":"6621bab08e80f97a2ab7a39da0a2d0718bccc2e28df57c5c270af058d1867031"} Feb 02 10:44:04 crc kubenswrapper[4845]: I0202 10:44:04.739106 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" event={"ID":"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929","Type":"ContainerStarted","Data":"6664fdb51e9f6a8b391911c6c7d62987a130b0a6db4a43a9ecda5af88cb31a75"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.796809 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" 
event={"ID":"10b5b71f-47de-4ca2-9133-254552173c73","Type":"ContainerStarted","Data":"78687d78e3128838ea4b33d9d6c0480812586a8dda1fb0e47f202f6181ff4482"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.798098 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.799623 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" event={"ID":"27a684fe-6402-4a0d-ab7c-e5c4eab14a64","Type":"ContainerStarted","Data":"9e9cf2537eb999394a49df9adfcbd67effcac750eebcfc8159d0ed7634816b11"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.800029 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.801413 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" event={"ID":"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929","Type":"ContainerStarted","Data":"f25968fd3ac0ec38b9e0068b09c40f923f915c065d0ceb0b769995a2864f808b"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.802974 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76788598db-sbp94" event={"ID":"796c275c-0c9b-4b2e-ba0f-7fbeb645028a","Type":"ContainerStarted","Data":"b294f02fdcea07f8d56ef83c4d09d05f3b673932a5dc905ba73b25323a1cf900"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.803501 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.805254 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" 
event={"ID":"4af06166-f541-44e7-8b4b-37e4f39a8729","Type":"ContainerStarted","Data":"8c3cf11d52f709e9ffb9c631880d793d912342bc18f98575e29699dbd166a68e"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.805612 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.807637 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"1f889290-f739-444c-a278-254f68d9d886","Type":"ContainerStarted","Data":"e4773bff38cfe56687cab44f15a138662c8df95a8ab1fe431416a4e630a3d43c"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.808044 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.810743 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"2d889e99-8118-4f52-ab20-b69a55bec079","Type":"ContainerStarted","Data":"268b8309eaae83fd84768af1f8ff9df6c24ac610dcb9cb3d979577890a74fb11"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.811219 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.813186 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" event={"ID":"1a4ec7d2-3bae-4f70-9a46-e90b067a0518","Type":"ContainerStarted","Data":"b79cc812663c2471f6b6c4042318a01a929fbf597d53f35745d47d6dfeb5bce6"} Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.822426 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.44483239 podStartE2EDuration="6.822402577s" podCreationTimestamp="2026-02-02 10:44:02 +0000 UTC" 
firstStartedPulling="2026-02-02 10:44:04.391334568 +0000 UTC m=+725.482736018" lastFinishedPulling="2026-02-02 10:44:07.768904755 +0000 UTC m=+728.860306205" observedRunningTime="2026-02-02 10:44:08.822097078 +0000 UTC m=+729.913498548" watchObservedRunningTime="2026-02-02 10:44:08.822402577 +0000 UTC m=+729.913804027" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.846706 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76788598db-sbp94" podStartSLOduration=2.254099474 podStartE2EDuration="6.846687279s" podCreationTimestamp="2026-02-02 10:44:02 +0000 UTC" firstStartedPulling="2026-02-02 10:44:03.249092082 +0000 UTC m=+724.340493522" lastFinishedPulling="2026-02-02 10:44:07.841679877 +0000 UTC m=+728.933081327" observedRunningTime="2026-02-02 10:44:08.839616824 +0000 UTC m=+729.931018274" watchObservedRunningTime="2026-02-02 10:44:08.846687279 +0000 UTC m=+729.938088729" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.871261 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.345944374 podStartE2EDuration="6.871237248s" podCreationTimestamp="2026-02-02 10:44:02 +0000 UTC" firstStartedPulling="2026-02-02 10:44:04.330418729 +0000 UTC m=+725.421820179" lastFinishedPulling="2026-02-02 10:44:07.855711603 +0000 UTC m=+728.947113053" observedRunningTime="2026-02-02 10:44:08.861387193 +0000 UTC m=+729.952788643" watchObservedRunningTime="2026-02-02 10:44:08.871237248 +0000 UTC m=+729.962638698" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.892306 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" podStartSLOduration=2.352461276 podStartE2EDuration="6.892288306s" podCreationTimestamp="2026-02-02 10:44:02 +0000 UTC" firstStartedPulling="2026-02-02 10:44:03.288595214 +0000 UTC m=+724.379996664" 
lastFinishedPulling="2026-02-02 10:44:07.828422244 +0000 UTC m=+728.919823694" observedRunningTime="2026-02-02 10:44:08.888838916 +0000 UTC m=+729.980240366" watchObservedRunningTime="2026-02-02 10:44:08.892288306 +0000 UTC m=+729.983689756" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.915166 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" podStartSLOduration=2.585845027 podStartE2EDuration="6.915148316s" podCreationTimestamp="2026-02-02 10:44:02 +0000 UTC" firstStartedPulling="2026-02-02 10:44:03.488757446 +0000 UTC m=+724.580158896" lastFinishedPulling="2026-02-02 10:44:07.818060735 +0000 UTC m=+728.909462185" observedRunningTime="2026-02-02 10:44:08.908981558 +0000 UTC m=+730.000383018" watchObservedRunningTime="2026-02-02 10:44:08.915148316 +0000 UTC m=+730.006549766" Feb 02 10:44:08 crc kubenswrapper[4845]: I0202 10:44:08.930171 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.362876124 podStartE2EDuration="6.93015154s" podCreationTimestamp="2026-02-02 10:44:02 +0000 UTC" firstStartedPulling="2026-02-02 10:44:04.272681921 +0000 UTC m=+725.364083381" lastFinishedPulling="2026-02-02 10:44:07.839957347 +0000 UTC m=+728.931358797" observedRunningTime="2026-02-02 10:44:08.925047022 +0000 UTC m=+730.016448472" watchObservedRunningTime="2026-02-02 10:44:08.93015154 +0000 UTC m=+730.021552990" Feb 02 10:44:10 crc kubenswrapper[4845]: I0202 10:44:10.828008 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" event={"ID":"2b18d0a9-d2cc-4d0b-9ede-a78da13ac929","Type":"ContainerStarted","Data":"0ae8b619b97254e1988c8e40e4fe484e25c2f48bb2c59373b65e42fb11d7ee91"} Feb 02 10:44:10 crc kubenswrapper[4845]: I0202 10:44:10.828802 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:10 crc kubenswrapper[4845]: I0202 10:44:10.830449 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" event={"ID":"1a4ec7d2-3bae-4f70-9a46-e90b067a0518","Type":"ContainerStarted","Data":"e28f81a10502d763f12a8c74c9cd0fcef2297c34733c92185f483974f167657d"} Feb 02 10:44:10 crc kubenswrapper[4845]: I0202 10:44:10.837766 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:10 crc kubenswrapper[4845]: I0202 10:44:10.853099 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" podStartSLOduration=2.699734746 podStartE2EDuration="8.853076236s" podCreationTimestamp="2026-02-02 10:44:02 +0000 UTC" firstStartedPulling="2026-02-02 10:44:04.066276668 +0000 UTC m=+725.157678118" lastFinishedPulling="2026-02-02 10:44:10.219618158 +0000 UTC m=+731.311019608" observedRunningTime="2026-02-02 10:44:10.847748933 +0000 UTC m=+731.939150373" watchObservedRunningTime="2026-02-02 10:44:10.853076236 +0000 UTC m=+731.944477686" Feb 02 10:44:10 crc kubenswrapper[4845]: I0202 10:44:10.870397 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" podStartSLOduration=2.778160792 podStartE2EDuration="8.870379096s" podCreationTimestamp="2026-02-02 10:44:02 +0000 UTC" firstStartedPulling="2026-02-02 10:44:04.123905823 +0000 UTC m=+725.215307273" lastFinishedPulling="2026-02-02 10:44:10.216124127 +0000 UTC m=+731.307525577" observedRunningTime="2026-02-02 10:44:10.86530193 +0000 UTC m=+731.956703380" watchObservedRunningTime="2026-02-02 10:44:10.870379096 +0000 UTC m=+731.961780546" Feb 02 10:44:11 crc kubenswrapper[4845]: I0202 10:44:11.836685 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:11 crc kubenswrapper[4845]: I0202 10:44:11.837153 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:11 crc kubenswrapper[4845]: I0202 10:44:11.837195 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:11 crc kubenswrapper[4845]: I0202 10:44:11.845661 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:11 crc kubenswrapper[4845]: I0202 10:44:11.847997 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-vr5gw" Feb 02 10:44:11 crc kubenswrapper[4845]: I0202 10:44:11.848709 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-cf45dcc8c-wn9nt" Feb 02 10:44:16 crc kubenswrapper[4845]: I0202 10:44:16.237318 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:44:16 crc kubenswrapper[4845]: I0202 10:44:16.237388 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:44:23 crc kubenswrapper[4845]: I0202 10:44:23.776019 4845 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure 
output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Feb 02 10:44:23 crc kubenswrapper[4845]: I0202 10:44:23.777651 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2d889e99-8118-4f52-ab20-b69a55bec079" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 10:44:23 crc kubenswrapper[4845]: I0202 10:44:23.846768 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Feb 02 10:44:23 crc kubenswrapper[4845]: I0202 10:44:23.914161 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Feb 02 10:44:32 crc kubenswrapper[4845]: I0202 10:44:32.690260 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5f678c8dd6-847z7" Feb 02 10:44:32 crc kubenswrapper[4845]: I0202 10:44:32.767327 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76788598db-sbp94" Feb 02 10:44:32 crc kubenswrapper[4845]: I0202 10:44:32.994166 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-69d9546745-dz4l8" Feb 02 10:44:33 crc kubenswrapper[4845]: I0202 10:44:33.780252 4845 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Feb 02 10:44:33 crc kubenswrapper[4845]: I0202 10:44:33.780319 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2d889e99-8118-4f52-ab20-b69a55bec079" containerName="loki-ingester" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Feb 02 10:44:34 crc kubenswrapper[4845]: I0202 10:44:34.072840 4845 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 02 10:44:43 crc kubenswrapper[4845]: I0202 10:44:43.775221 4845 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Feb 02 10:44:43 crc kubenswrapper[4845]: I0202 10:44:43.775772 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2d889e99-8118-4f52-ab20-b69a55bec079" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 10:44:46 crc kubenswrapper[4845]: I0202 10:44:46.237820 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:44:46 crc kubenswrapper[4845]: I0202 10:44:46.238184 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:44:46 crc kubenswrapper[4845]: I0202 10:44:46.238230 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:44:46 crc kubenswrapper[4845]: I0202 10:44:46.238830 4845 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"faf8e85b5f2efdb91a1dcdfb7d3d9ff033956bb15922ba78cb0d90c0661d34f8"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:44:46 crc kubenswrapper[4845]: I0202 10:44:46.238873 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://faf8e85b5f2efdb91a1dcdfb7d3d9ff033956bb15922ba78cb0d90c0661d34f8" gracePeriod=600 Feb 02 10:44:47 crc kubenswrapper[4845]: I0202 10:44:47.112720 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="faf8e85b5f2efdb91a1dcdfb7d3d9ff033956bb15922ba78cb0d90c0661d34f8" exitCode=0 Feb 02 10:44:47 crc kubenswrapper[4845]: I0202 10:44:47.112781 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"faf8e85b5f2efdb91a1dcdfb7d3d9ff033956bb15922ba78cb0d90c0661d34f8"} Feb 02 10:44:47 crc kubenswrapper[4845]: I0202 10:44:47.113418 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"5c8a61ef5e1d6c97c545382d55b8a80c690bc952b158b0bc2a66b1f6b33d1ffd"} Feb 02 10:44:47 crc kubenswrapper[4845]: I0202 10:44:47.113455 4845 scope.go:117] "RemoveContainer" containerID="511b5a9de737657a9a1ff84c736b95abf52206e96ffdc8cf5decfdca7aa28582" Feb 02 10:44:53 crc kubenswrapper[4845]: I0202 10:44:53.775579 4845 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness 
probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Feb 02 10:44:53 crc kubenswrapper[4845]: I0202 10:44:53.776453 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2d889e99-8118-4f52-ab20-b69a55bec079" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.214654 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq"] Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.215981 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.218088 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.218209 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.222726 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq"] Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.407940 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af5c06e-cf07-4f85-97e9-6b93ec03281c-config-volume\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.407994 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6af5c06e-cf07-4f85-97e9-6b93ec03281c-secret-volume\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.408262 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdvk9\" (UniqueName: \"kubernetes.io/projected/6af5c06e-cf07-4f85-97e9-6b93ec03281c-kube-api-access-qdvk9\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.509565 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdvk9\" (UniqueName: \"kubernetes.io/projected/6af5c06e-cf07-4f85-97e9-6b93ec03281c-kube-api-access-qdvk9\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.509655 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af5c06e-cf07-4f85-97e9-6b93ec03281c-config-volume\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.509698 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6af5c06e-cf07-4f85-97e9-6b93ec03281c-secret-volume\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.511643 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af5c06e-cf07-4f85-97e9-6b93ec03281c-config-volume\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.516216 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6af5c06e-cf07-4f85-97e9-6b93ec03281c-secret-volume\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.529439 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdvk9\" (UniqueName: \"kubernetes.io/projected/6af5c06e-cf07-4f85-97e9-6b93ec03281c-kube-api-access-qdvk9\") pod \"collect-profiles-29500485-wl2wq\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.547237 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:00 crc kubenswrapper[4845]: I0202 10:45:00.963064 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq"] Feb 02 10:45:01 crc kubenswrapper[4845]: I0202 10:45:01.224428 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" event={"ID":"6af5c06e-cf07-4f85-97e9-6b93ec03281c","Type":"ContainerStarted","Data":"a18e0c7c2dae09d3f5627d9a9acf7414e19ce7a97f56c94017a2bc4812f89130"} Feb 02 10:45:01 crc kubenswrapper[4845]: I0202 10:45:01.224470 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" event={"ID":"6af5c06e-cf07-4f85-97e9-6b93ec03281c","Type":"ContainerStarted","Data":"bc665438186f6e403402c20d6434c54ee1529c3a1c78753ddc00b1b116fb19df"} Feb 02 10:45:01 crc kubenswrapper[4845]: I0202 10:45:01.243204 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" podStartSLOduration=1.243177067 podStartE2EDuration="1.243177067s" podCreationTimestamp="2026-02-02 10:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:45:01.23844357 +0000 UTC m=+782.329845020" watchObservedRunningTime="2026-02-02 10:45:01.243177067 +0000 UTC m=+782.334578517" Feb 02 10:45:02 crc kubenswrapper[4845]: I0202 10:45:02.235430 4845 generic.go:334] "Generic (PLEG): container finished" podID="6af5c06e-cf07-4f85-97e9-6b93ec03281c" containerID="a18e0c7c2dae09d3f5627d9a9acf7414e19ce7a97f56c94017a2bc4812f89130" exitCode=0 Feb 02 10:45:02 crc kubenswrapper[4845]: I0202 10:45:02.235558 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" event={"ID":"6af5c06e-cf07-4f85-97e9-6b93ec03281c","Type":"ContainerDied","Data":"a18e0c7c2dae09d3f5627d9a9acf7414e19ce7a97f56c94017a2bc4812f89130"} Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.472280 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.653066 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6af5c06e-cf07-4f85-97e9-6b93ec03281c-secret-volume\") pod \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.653253 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdvk9\" (UniqueName: \"kubernetes.io/projected/6af5c06e-cf07-4f85-97e9-6b93ec03281c-kube-api-access-qdvk9\") pod \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.653306 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af5c06e-cf07-4f85-97e9-6b93ec03281c-config-volume\") pod \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\" (UID: \"6af5c06e-cf07-4f85-97e9-6b93ec03281c\") " Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.654267 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6af5c06e-cf07-4f85-97e9-6b93ec03281c-config-volume" (OuterVolumeSpecName: "config-volume") pod "6af5c06e-cf07-4f85-97e9-6b93ec03281c" (UID: "6af5c06e-cf07-4f85-97e9-6b93ec03281c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.657589 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af5c06e-cf07-4f85-97e9-6b93ec03281c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6af5c06e-cf07-4f85-97e9-6b93ec03281c" (UID: "6af5c06e-cf07-4f85-97e9-6b93ec03281c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.657659 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af5c06e-cf07-4f85-97e9-6b93ec03281c-kube-api-access-qdvk9" (OuterVolumeSpecName: "kube-api-access-qdvk9") pod "6af5c06e-cf07-4f85-97e9-6b93ec03281c" (UID: "6af5c06e-cf07-4f85-97e9-6b93ec03281c"). InnerVolumeSpecName "kube-api-access-qdvk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.755567 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdvk9\" (UniqueName: \"kubernetes.io/projected/6af5c06e-cf07-4f85-97e9-6b93ec03281c-kube-api-access-qdvk9\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.755634 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af5c06e-cf07-4f85-97e9-6b93ec03281c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.755645 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6af5c06e-cf07-4f85-97e9-6b93ec03281c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:03 crc kubenswrapper[4845]: I0202 10:45:03.774464 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Feb 02 10:45:04 crc kubenswrapper[4845]: I0202 
10:45:04.255026 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" event={"ID":"6af5c06e-cf07-4f85-97e9-6b93ec03281c","Type":"ContainerDied","Data":"bc665438186f6e403402c20d6434c54ee1529c3a1c78753ddc00b1b116fb19df"} Feb 02 10:45:04 crc kubenswrapper[4845]: I0202 10:45:04.255656 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc665438186f6e403402c20d6434c54ee1529c3a1c78753ddc00b1b116fb19df" Feb 02 10:45:04 crc kubenswrapper[4845]: I0202 10:45:04.255730 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.231548 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-qdkf9"] Feb 02 10:45:22 crc kubenswrapper[4845]: E0202 10:45:22.232404 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af5c06e-cf07-4f85-97e9-6b93ec03281c" containerName="collect-profiles" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.232420 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af5c06e-cf07-4f85-97e9-6b93ec03281c" containerName="collect-profiles" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.232565 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af5c06e-cf07-4f85-97e9-6b93ec03281c" containerName="collect-profiles" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.233134 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.235934 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-4mdxq" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.235957 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.236042 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.236685 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.239260 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.247806 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.252287 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-qdkf9"] Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362102 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362154 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzn7x\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-kube-api-access-fzn7x\") pod \"collector-qdkf9\" (UID: 
\"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362187 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-sa-token\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362210 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-token\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362240 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-entrypoint\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362267 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-trusted-ca\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362303 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " 
pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362326 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config-openshift-service-cacrt\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362358 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4649e099-a892-4773-86ef-705fea600417-datadir\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362394 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.362414 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4649e099-a892-4773-86ef-705fea600417-tmp\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.385657 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-qdkf9"] Feb 02 10:45:22 crc kubenswrapper[4845]: E0202 10:45:22.386250 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint 
kube-api-access-fzn7x metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-qdkf9" podUID="4649e099-a892-4773-86ef-705fea600417" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465241 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-trusted-ca\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465339 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465373 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config-openshift-service-cacrt\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465409 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4649e099-a892-4773-86ef-705fea600417-datadir\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465439 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config\") pod \"collector-qdkf9\" (UID: 
\"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465456 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4649e099-a892-4773-86ef-705fea600417-tmp\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465491 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465511 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzn7x\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-kube-api-access-fzn7x\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465539 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-sa-token\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465562 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-token\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.465589 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-entrypoint\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: E0202 10:45:22.466154 4845 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Feb 02 10:45:22 crc kubenswrapper[4845]: E0202 10:45:22.466279 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver podName:4649e099-a892-4773-86ef-705fea600417 nodeName:}" failed. No retries permitted until 2026-02-02 10:45:22.966245731 +0000 UTC m=+804.057647191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver") pod "collector-qdkf9" (UID: "4649e099-a892-4773-86ef-705fea600417") : secret "collector-syslog-receiver" not found Feb 02 10:45:22 crc kubenswrapper[4845]: E0202 10:45:22.466425 4845 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Feb 02 10:45:22 crc kubenswrapper[4845]: E0202 10:45:22.466499 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics podName:4649e099-a892-4773-86ef-705fea600417 nodeName:}" failed. No retries permitted until 2026-02-02 10:45:22.966479018 +0000 UTC m=+804.057880458 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics") pod "collector-qdkf9" (UID: "4649e099-a892-4773-86ef-705fea600417") : secret "collector-metrics" not found Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.466735 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4649e099-a892-4773-86ef-705fea600417-datadir\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.466827 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-entrypoint\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.467489 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config-openshift-service-cacrt\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.467627 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-trusted-ca\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.467923 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config\") pod \"collector-qdkf9\" (UID: 
\"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.477333 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-token\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.477704 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4649e099-a892-4773-86ef-705fea600417-tmp\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.491920 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzn7x\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-kube-api-access-fzn7x\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.494397 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-sa-token\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.973637 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.974005 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.978099 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:22 crc kubenswrapper[4845]: I0202 10:45:22.982958 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics\") pod \"collector-qdkf9\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " pod="openshift-logging/collector-qdkf9" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.395918 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-qdkf9" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.406642 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-qdkf9" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.581694 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-sa-token\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582049 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4649e099-a892-4773-86ef-705fea600417-tmp\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582070 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582087 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4649e099-a892-4773-86ef-705fea600417-datadir\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582144 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-token\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582166 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: 
\"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582201 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582240 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-trusted-ca\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582250 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4649e099-a892-4773-86ef-705fea600417-datadir" (OuterVolumeSpecName: "datadir") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "datadir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582331 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-entrypoint\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582353 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzn7x\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-kube-api-access-fzn7x\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582429 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config-openshift-service-cacrt\") pod \"4649e099-a892-4773-86ef-705fea600417\" (UID: \"4649e099-a892-4773-86ef-705fea600417\") " Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582717 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config" (OuterVolumeSpecName: "config") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.582968 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "entrypoint". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.583071 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.583084 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.583094 4845 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/4649e099-a892-4773-86ef-705fea600417-datadir\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.583107 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.585716 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics" (OuterVolumeSpecName: "metrics") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.585787 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-sa-token" (OuterVolumeSpecName: "sa-token") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.585917 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-kube-api-access-fzn7x" (OuterVolumeSpecName: "kube-api-access-fzn7x") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "kube-api-access-fzn7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.586116 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-token" (OuterVolumeSpecName: "collector-token") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.586490 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "collector-syslog-receiver". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.586652 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4649e099-a892-4773-86ef-705fea600417-tmp" (OuterVolumeSpecName: "tmp") pod "4649e099-a892-4773-86ef-705fea600417" (UID: "4649e099-a892-4773-86ef-705fea600417"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685145 4845 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685183 4845 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4649e099-a892-4773-86ef-705fea600417-tmp\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685194 4845 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685205 4845 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685214 4845 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/4649e099-a892-4773-86ef-705fea600417-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685223 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685231 4845 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-entrypoint\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685239 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzn7x\" (UniqueName: \"kubernetes.io/projected/4649e099-a892-4773-86ef-705fea600417-kube-api-access-fzn7x\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:23 crc kubenswrapper[4845]: I0202 10:45:23.685251 4845 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/4649e099-a892-4773-86ef-705fea600417-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.401873 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-qdkf9" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.436329 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-qdkf9"] Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.451974 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-qdkf9"] Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.457041 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-bkwj8"] Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.458038 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.459924 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.462055 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.462296 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-4mdxq" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.462679 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.462941 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.471281 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.475875 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-bkwj8"] Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.598686 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-trusted-ca\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.598767 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-entrypoint\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " 
pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.598822 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54453df2-b815-42be-9542-aef7eed68aeb-tmp\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.598873 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-collector-token\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.598927 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw47z\" (UniqueName: \"kubernetes.io/projected/54453df2-b815-42be-9542-aef7eed68aeb-kube-api-access-pw47z\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.599011 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-collector-syslog-receiver\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.599032 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-config\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 
crc kubenswrapper[4845]: I0202 10:45:24.599088 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/54453df2-b815-42be-9542-aef7eed68aeb-datadir\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.599114 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-metrics\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.599252 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-config-openshift-service-cacrt\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.599297 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/54453df2-b815-42be-9542-aef7eed68aeb-sa-token\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701091 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/54453df2-b815-42be-9542-aef7eed68aeb-datadir\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701148 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-metrics\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701200 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/54453df2-b815-42be-9542-aef7eed68aeb-datadir\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701295 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-config-openshift-service-cacrt\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701324 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/54453df2-b815-42be-9542-aef7eed68aeb-sa-token\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701387 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-trusted-ca\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701411 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-entrypoint\") pod 
\"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701432 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54453df2-b815-42be-9542-aef7eed68aeb-tmp\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701463 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-collector-token\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701487 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw47z\" (UniqueName: \"kubernetes.io/projected/54453df2-b815-42be-9542-aef7eed68aeb-kube-api-access-pw47z\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701533 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-collector-syslog-receiver\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.701557 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-config\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: 
I0202 10:45:24.702631 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-trusted-ca\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.702644 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-config\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.702678 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-entrypoint\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.702802 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/54453df2-b815-42be-9542-aef7eed68aeb-config-openshift-service-cacrt\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.710519 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-collector-syslog-receiver\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.714409 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-metrics\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.714620 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/54453df2-b815-42be-9542-aef7eed68aeb-tmp\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.715581 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/54453df2-b815-42be-9542-aef7eed68aeb-collector-token\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.717690 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/54453df2-b815-42be-9542-aef7eed68aeb-sa-token\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.726589 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw47z\" (UniqueName: \"kubernetes.io/projected/54453df2-b815-42be-9542-aef7eed68aeb-kube-api-access-pw47z\") pod \"collector-bkwj8\" (UID: \"54453df2-b815-42be-9542-aef7eed68aeb\") " pod="openshift-logging/collector-bkwj8" Feb 02 10:45:24 crc kubenswrapper[4845]: I0202 10:45:24.782010 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-bkwj8" Feb 02 10:45:25 crc kubenswrapper[4845]: I0202 10:45:25.229952 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-bkwj8"] Feb 02 10:45:25 crc kubenswrapper[4845]: I0202 10:45:25.409492 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-bkwj8" event={"ID":"54453df2-b815-42be-9542-aef7eed68aeb","Type":"ContainerStarted","Data":"cf1be961458e23208d77e538c1db92cce617f00a58998df1c1158144ad2c8432"} Feb 02 10:45:25 crc kubenswrapper[4845]: I0202 10:45:25.723683 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4649e099-a892-4773-86ef-705fea600417" path="/var/lib/kubelet/pods/4649e099-a892-4773-86ef-705fea600417/volumes" Feb 02 10:45:32 crc kubenswrapper[4845]: I0202 10:45:32.464052 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-bkwj8" event={"ID":"54453df2-b815-42be-9542-aef7eed68aeb","Type":"ContainerStarted","Data":"3a32a6da534ae3eb91e672e8dae3cc2f5b718751b16f3faa4c4578bf1c0f5f3e"} Feb 02 10:45:32 crc kubenswrapper[4845]: I0202 10:45:32.487029 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-bkwj8" podStartSLOduration=1.964923389 podStartE2EDuration="8.487012178s" podCreationTimestamp="2026-02-02 10:45:24 +0000 UTC" firstStartedPulling="2026-02-02 10:45:25.246464009 +0000 UTC m=+806.337865459" lastFinishedPulling="2026-02-02 10:45:31.768552798 +0000 UTC m=+812.859954248" observedRunningTime="2026-02-02 10:45:32.483758584 +0000 UTC m=+813.575160034" watchObservedRunningTime="2026-02-02 10:45:32.487012178 +0000 UTC m=+813.578413628" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.372110 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq"] Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.374950 4845 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.378023 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.445088 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq"] Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.520699 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.520782 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.520870 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scml7\" (UniqueName: \"kubernetes.io/projected/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-kube-api-access-scml7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 
10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.622817 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.622915 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.622972 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scml7\" (UniqueName: \"kubernetes.io/projected/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-kube-api-access-scml7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.623532 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.623825 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.643707 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scml7\" (UniqueName: \"kubernetes.io/projected/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-kube-api-access-scml7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:01 crc kubenswrapper[4845]: I0202 10:46:01.697768 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:02 crc kubenswrapper[4845]: I0202 10:46:02.146436 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq"] Feb 02 10:46:02 crc kubenswrapper[4845]: I0202 10:46:02.669924 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" event={"ID":"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92","Type":"ContainerStarted","Data":"0ed13668fba3d3c04be42e0559bf8f0a6f2a0025f1a528d3973c4070b9282195"} Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.677148 4845 generic.go:334] "Generic (PLEG): container finished" podID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerID="289acee830b3427c14016b1f2c49bd685323c61f36d437f388f65b1c9a1a61a8" exitCode=0 Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.677201 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" event={"ID":"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92","Type":"ContainerDied","Data":"289acee830b3427c14016b1f2c49bd685323c61f36d437f388f65b1c9a1a61a8"} Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.709731 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k47lx"] Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.711252 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.721395 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k47lx"] Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.770790 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxhbf\" (UniqueName: \"kubernetes.io/projected/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-kube-api-access-jxhbf\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.771098 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-catalog-content\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.771239 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-utilities\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" 
Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.873576 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxhbf\" (UniqueName: \"kubernetes.io/projected/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-kube-api-access-jxhbf\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.873660 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-catalog-content\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.873702 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-utilities\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.874224 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-catalog-content\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.874301 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-utilities\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:03 crc kubenswrapper[4845]: I0202 10:46:03.894052 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxhbf\" (UniqueName: \"kubernetes.io/projected/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-kube-api-access-jxhbf\") pod \"redhat-operators-k47lx\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:04 crc kubenswrapper[4845]: I0202 10:46:04.033846 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:04 crc kubenswrapper[4845]: I0202 10:46:04.508503 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k47lx"] Feb 02 10:46:04 crc kubenswrapper[4845]: I0202 10:46:04.688036 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k47lx" event={"ID":"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b","Type":"ContainerStarted","Data":"f64227e8467858e78a947aa30dbb8e5c523b4bf1e03d44b55588307424690239"} Feb 02 10:46:05 crc kubenswrapper[4845]: I0202 10:46:05.696610 4845 generic.go:334] "Generic (PLEG): container finished" podID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerID="376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f" exitCode=0 Feb 02 10:46:05 crc kubenswrapper[4845]: I0202 10:46:05.696662 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k47lx" event={"ID":"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b","Type":"ContainerDied","Data":"376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f"} Feb 02 10:46:05 crc kubenswrapper[4845]: I0202 10:46:05.699573 4845 generic.go:334] "Generic (PLEG): container finished" podID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerID="77bf7f8d7a6baaa76df13bbe774d42150b872098bd56855ba76680ab82eff2d6" exitCode=0 Feb 02 10:46:05 crc kubenswrapper[4845]: I0202 10:46:05.699607 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" event={"ID":"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92","Type":"ContainerDied","Data":"77bf7f8d7a6baaa76df13bbe774d42150b872098bd56855ba76680ab82eff2d6"} Feb 02 10:46:06 crc kubenswrapper[4845]: I0202 10:46:06.706746 4845 generic.go:334] "Generic (PLEG): container finished" podID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerID="3727a44ded759bc6d1d03787f972a5d9f13741e7eab3d69e1d386762e48782aa" exitCode=0 Feb 02 10:46:06 crc kubenswrapper[4845]: I0202 10:46:06.707060 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" event={"ID":"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92","Type":"ContainerDied","Data":"3727a44ded759bc6d1d03787f972a5d9f13741e7eab3d69e1d386762e48782aa"} Feb 02 10:46:06 crc kubenswrapper[4845]: I0202 10:46:06.709176 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k47lx" event={"ID":"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b","Type":"ContainerStarted","Data":"608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616"} Feb 02 10:46:07 crc kubenswrapper[4845]: I0202 10:46:07.718455 4845 generic.go:334] "Generic (PLEG): container finished" podID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerID="608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616" exitCode=0 Feb 02 10:46:07 crc kubenswrapper[4845]: I0202 10:46:07.727997 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k47lx" event={"ID":"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b","Type":"ContainerDied","Data":"608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616"} Feb 02 10:46:07 crc kubenswrapper[4845]: I0202 10:46:07.972761 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.140248 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scml7\" (UniqueName: \"kubernetes.io/projected/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-kube-api-access-scml7\") pod \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.140411 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-bundle\") pod \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.140445 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-util\") pod \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\" (UID: \"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92\") " Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.152471 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-bundle" (OuterVolumeSpecName: "bundle") pod "cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" (UID: "cbe2dd1b-0b96-4fb7-8873-f9c1378bde92"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.158987 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-kube-api-access-scml7" (OuterVolumeSpecName: "kube-api-access-scml7") pod "cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" (UID: "cbe2dd1b-0b96-4fb7-8873-f9c1378bde92"). InnerVolumeSpecName "kube-api-access-scml7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.242801 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.242836 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scml7\" (UniqueName: \"kubernetes.io/projected/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-kube-api-access-scml7\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.264481 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-util" (OuterVolumeSpecName: "util") pod "cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" (UID: "cbe2dd1b-0b96-4fb7-8873-f9c1378bde92"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.344239 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cbe2dd1b-0b96-4fb7-8873-f9c1378bde92-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.727218 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k47lx" event={"ID":"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b","Type":"ContainerStarted","Data":"addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c"} Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.729570 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" event={"ID":"cbe2dd1b-0b96-4fb7-8873-f9c1378bde92","Type":"ContainerDied","Data":"0ed13668fba3d3c04be42e0559bf8f0a6f2a0025f1a528d3973c4070b9282195"} Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.729613 4845 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed13668fba3d3c04be42e0559bf8f0a6f2a0025f1a528d3973c4070b9282195" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.729620 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq" Feb 02 10:46:08 crc kubenswrapper[4845]: I0202 10:46:08.758183 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k47lx" podStartSLOduration=3.151007862 podStartE2EDuration="5.75815783s" podCreationTimestamp="2026-02-02 10:46:03 +0000 UTC" firstStartedPulling="2026-02-02 10:46:05.698264845 +0000 UTC m=+846.789666295" lastFinishedPulling="2026-02-02 10:46:08.305414813 +0000 UTC m=+849.396816263" observedRunningTime="2026-02-02 10:46:08.751065225 +0000 UTC m=+849.842466685" watchObservedRunningTime="2026-02-02 10:46:08.75815783 +0000 UTC m=+849.849559280" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.023714 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-xpndf"] Feb 02 10:46:11 crc kubenswrapper[4845]: E0202 10:46:11.024295 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerName="util" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.024313 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerName="util" Feb 02 10:46:11 crc kubenswrapper[4845]: E0202 10:46:11.024350 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerName="pull" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.024358 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerName="pull" Feb 02 10:46:11 crc kubenswrapper[4845]: E0202 10:46:11.024371 
4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerName="extract" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.024379 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerName="extract" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.036219 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbe2dd1b-0b96-4fb7-8873-f9c1378bde92" containerName="extract" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.037415 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-xpndf" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.040900 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.041069 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.041124 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-c6cxh" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.050845 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-xpndf"] Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.186697 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg6t4\" (UniqueName: \"kubernetes.io/projected/17b0c917-994c-41bc-9fbf-6e9d86d65bca-kube-api-access-lg6t4\") pod \"nmstate-operator-646758c888-xpndf\" (UID: \"17b0c917-994c-41bc-9fbf-6e9d86d65bca\") " pod="openshift-nmstate/nmstate-operator-646758c888-xpndf" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.289194 4845 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lg6t4\" (UniqueName: \"kubernetes.io/projected/17b0c917-994c-41bc-9fbf-6e9d86d65bca-kube-api-access-lg6t4\") pod \"nmstate-operator-646758c888-xpndf\" (UID: \"17b0c917-994c-41bc-9fbf-6e9d86d65bca\") " pod="openshift-nmstate/nmstate-operator-646758c888-xpndf" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.312687 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg6t4\" (UniqueName: \"kubernetes.io/projected/17b0c917-994c-41bc-9fbf-6e9d86d65bca-kube-api-access-lg6t4\") pod \"nmstate-operator-646758c888-xpndf\" (UID: \"17b0c917-994c-41bc-9fbf-6e9d86d65bca\") " pod="openshift-nmstate/nmstate-operator-646758c888-xpndf" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.410837 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-xpndf" Feb 02 10:46:11 crc kubenswrapper[4845]: I0202 10:46:11.997243 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-xpndf"] Feb 02 10:46:12 crc kubenswrapper[4845]: W0202 10:46:12.007906 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17b0c917_994c_41bc_9fbf_6e9d86d65bca.slice/crio-233d65503c4378f7c1557b5f02dbc6eb5cced22633f6b0ca4577a6370816cc5d WatchSource:0}: Error finding container 233d65503c4378f7c1557b5f02dbc6eb5cced22633f6b0ca4577a6370816cc5d: Status 404 returned error can't find the container with id 233d65503c4378f7c1557b5f02dbc6eb5cced22633f6b0ca4577a6370816cc5d Feb 02 10:46:12 crc kubenswrapper[4845]: I0202 10:46:12.763713 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-xpndf" event={"ID":"17b0c917-994c-41bc-9fbf-6e9d86d65bca","Type":"ContainerStarted","Data":"233d65503c4378f7c1557b5f02dbc6eb5cced22633f6b0ca4577a6370816cc5d"} Feb 02 10:46:14 crc kubenswrapper[4845]: I0202 
10:46:14.035039 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:14 crc kubenswrapper[4845]: I0202 10:46:14.035087 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:14 crc kubenswrapper[4845]: I0202 10:46:14.800195 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-xpndf" event={"ID":"17b0c917-994c-41bc-9fbf-6e9d86d65bca","Type":"ContainerStarted","Data":"5158065c1ba3f85ebdda886573c78529f7686a61d326243096b2ff309c90c42e"} Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.077519 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k47lx" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="registry-server" probeResult="failure" output=< Feb 02 10:46:15 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 10:46:15 crc kubenswrapper[4845]: > Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.856650 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-xpndf" podStartSLOduration=2.297230556 podStartE2EDuration="4.856629356s" podCreationTimestamp="2026-02-02 10:46:11 +0000 UTC" firstStartedPulling="2026-02-02 10:46:12.010510213 +0000 UTC m=+853.101911663" lastFinishedPulling="2026-02-02 10:46:14.569909013 +0000 UTC m=+855.661310463" observedRunningTime="2026-02-02 10:46:14.820337586 +0000 UTC m=+855.911739026" watchObservedRunningTime="2026-02-02 10:46:15.856629356 +0000 UTC m=+856.948030816" Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.860171 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-ksh5c"] Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.873787 4845 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5"] Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.862294 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.907600 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-ks7bq"] Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.908472 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.908645 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.910753 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.915428 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-mw55t" Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.915626 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-ksh5c"] Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.920965 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5"] Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.965074 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqp4z\" (UniqueName: \"kubernetes.io/projected/ed30c5ac-3449-4902-b948-34958198b224-kube-api-access-wqp4z\") pod \"nmstate-webhook-8474b5b9d8-k2dv5\" (UID: \"ed30c5ac-3449-4902-b948-34958198b224\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 
10:46:15.965520 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed30c5ac-3449-4902-b948-34958198b224-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-k2dv5\" (UID: \"ed30c5ac-3449-4902-b948-34958198b224\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:15 crc kubenswrapper[4845]: I0202 10:46:15.965623 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nvfw\" (UniqueName: \"kubernetes.io/projected/65b8d7a7-4de6-4edc-b652-999572c3494a-kube-api-access-4nvfw\") pod \"nmstate-metrics-54757c584b-ksh5c\" (UID: \"65b8d7a7-4de6-4edc-b652-999572c3494a\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.030398 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4"] Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.031514 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.034291 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jjxtg" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.035479 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.035626 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.055376 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4"] Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.067790 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66b5\" (UniqueName: \"kubernetes.io/projected/8c3ff69a-c422-491b-a933-0522f29d7e7c-kube-api-access-d66b5\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.068395 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-nmstate-lock\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.068550 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqp4z\" (UniqueName: \"kubernetes.io/projected/ed30c5ac-3449-4902-b948-34958198b224-kube-api-access-wqp4z\") pod \"nmstate-webhook-8474b5b9d8-k2dv5\" (UID: \"ed30c5ac-3449-4902-b948-34958198b224\") " 
pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.068631 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-dbus-socket\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.068706 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed30c5ac-3449-4902-b948-34958198b224-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-k2dv5\" (UID: \"ed30c5ac-3449-4902-b948-34958198b224\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.068794 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-ovs-socket\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.068875 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nvfw\" (UniqueName: \"kubernetes.io/projected/65b8d7a7-4de6-4edc-b652-999572c3494a-kube-api-access-4nvfw\") pod \"nmstate-metrics-54757c584b-ksh5c\" (UID: \"65b8d7a7-4de6-4edc-b652-999572c3494a\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" Feb 02 10:46:16 crc kubenswrapper[4845]: E0202 10:46:16.068934 4845 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 02 10:46:16 crc kubenswrapper[4845]: E0202 10:46:16.069221 4845 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ed30c5ac-3449-4902-b948-34958198b224-tls-key-pair podName:ed30c5ac-3449-4902-b948-34958198b224 nodeName:}" failed. No retries permitted until 2026-02-02 10:46:16.569200116 +0000 UTC m=+857.660601556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/ed30c5ac-3449-4902-b948-34958198b224-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-k2dv5" (UID: "ed30c5ac-3449-4902-b948-34958198b224") : secret "openshift-nmstate-webhook" not found Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.097967 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqp4z\" (UniqueName: \"kubernetes.io/projected/ed30c5ac-3449-4902-b948-34958198b224-kube-api-access-wqp4z\") pod \"nmstate-webhook-8474b5b9d8-k2dv5\" (UID: \"ed30c5ac-3449-4902-b948-34958198b224\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.104136 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nvfw\" (UniqueName: \"kubernetes.io/projected/65b8d7a7-4de6-4edc-b652-999572c3494a-kube-api-access-4nvfw\") pod \"nmstate-metrics-54757c584b-ksh5c\" (UID: \"65b8d7a7-4de6-4edc-b652-999572c3494a\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.170459 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d66b5\" (UniqueName: \"kubernetes.io/projected/8c3ff69a-c422-491b-a933-0522f29d7e7c-kube-api-access-d66b5\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.170541 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-nmstate-lock\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.170599 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-dbus-socket\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.170630 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.170683 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-ovs-socket\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.170728 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.170790 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-229lr\" (UniqueName: \"kubernetes.io/projected/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-kube-api-access-229lr\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.170932 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-nmstate-lock\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.171035 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-ovs-socket\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.171260 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8c3ff69a-c422-491b-a933-0522f29d7e7c-dbus-socket\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.194955 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d66b5\" (UniqueName: \"kubernetes.io/projected/8c3ff69a-c422-491b-a933-0522f29d7e7c-kube-api-access-d66b5\") pod \"nmstate-handler-ks7bq\" (UID: \"8c3ff69a-c422-491b-a933-0522f29d7e7c\") " pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.229612 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-58d87f97d7-w9v5x"] Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 
10:46:16.230652 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.237060 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.249234 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.277487 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.277816 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.278029 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-229lr\" (UniqueName: \"kubernetes.io/projected/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-kube-api-access-229lr\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.279548 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.289755 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.297080 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58d87f97d7-w9v5x"] Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.328552 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-229lr\" (UniqueName: \"kubernetes.io/projected/f49a4fe2-aa60-4d14-a9bb-f13d0066a542-kube-api-access-229lr\") pod \"nmstate-console-plugin-7754f76f8b-2phr4\" (UID: \"f49a4fe2-aa60-4d14-a9bb-f13d0066a542\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.363001 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.380851 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-trusted-ca-bundle\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.381147 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-serving-cert\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.381287 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-service-ca\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.397572 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp6kt\" (UniqueName: \"kubernetes.io/projected/c26d4007-db0b-4379-8431-d6e43dec7e9f-kube-api-access-fp6kt\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.397628 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-config\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.397765 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-oauth-serving-cert\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.397855 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-oauth-config\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.499924 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-oauth-serving-cert\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.500290 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-oauth-config\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.500389 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-trusted-ca-bundle\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.500416 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-serving-cert\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.500451 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-service-ca\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.500486 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp6kt\" (UniqueName: \"kubernetes.io/projected/c26d4007-db0b-4379-8431-d6e43dec7e9f-kube-api-access-fp6kt\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.500516 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-config\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.502066 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-config\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.506689 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-oauth-serving-cert\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.506951 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-service-ca\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.507670 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-trusted-ca-bundle\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.514901 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-serving-cert\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.514916 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-oauth-config\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.520444 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp6kt\" (UniqueName: \"kubernetes.io/projected/c26d4007-db0b-4379-8431-d6e43dec7e9f-kube-api-access-fp6kt\") pod \"console-58d87f97d7-w9v5x\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.602279 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed30c5ac-3449-4902-b948-34958198b224-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-k2dv5\" (UID: \"ed30c5ac-3449-4902-b948-34958198b224\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.605643 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ed30c5ac-3449-4902-b948-34958198b224-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-k2dv5\" (UID: \"ed30c5ac-3449-4902-b948-34958198b224\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.634942 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.714858 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-ksh5c"] Feb 02 10:46:16 crc kubenswrapper[4845]: W0202 10:46:16.722514 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b8d7a7_4de6_4edc_b652_999572c3494a.slice/crio-4370510a95928b8ff4e998d2870157f06396ec4c3f14b64b55e2685d12d2e497 WatchSource:0}: Error finding container 4370510a95928b8ff4e998d2870157f06396ec4c3f14b64b55e2685d12d2e497: Status 404 returned error can't find the container with id 4370510a95928b8ff4e998d2870157f06396ec4c3f14b64b55e2685d12d2e497 Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.822084 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4"] Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.822247 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ks7bq" event={"ID":"8c3ff69a-c422-491b-a933-0522f29d7e7c","Type":"ContainerStarted","Data":"0b396858010a9081426806fe57a0ca1beb98ee5e036d024acf2ec7bd2bdce074"} Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.823338 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" event={"ID":"65b8d7a7-4de6-4edc-b652-999572c3494a","Type":"ContainerStarted","Data":"4370510a95928b8ff4e998d2870157f06396ec4c3f14b64b55e2685d12d2e497"} Feb 02 10:46:16 crc kubenswrapper[4845]: W0202 10:46:16.828663 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf49a4fe2_aa60_4d14_a9bb_f13d0066a542.slice/crio-c11f12d1825f59697a4f6dadda31bbeed2e0a28a3b18e733bc9c274834f1aff5 WatchSource:0}: Error finding container 
c11f12d1825f59697a4f6dadda31bbeed2e0a28a3b18e733bc9c274834f1aff5: Status 404 returned error can't find the container with id c11f12d1825f59697a4f6dadda31bbeed2e0a28a3b18e733bc9c274834f1aff5 Feb 02 10:46:16 crc kubenswrapper[4845]: I0202 10:46:16.873474 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:17 crc kubenswrapper[4845]: I0202 10:46:17.137740 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58d87f97d7-w9v5x"] Feb 02 10:46:17 crc kubenswrapper[4845]: I0202 10:46:17.281989 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5"] Feb 02 10:46:17 crc kubenswrapper[4845]: W0202 10:46:17.285282 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded30c5ac_3449_4902_b948_34958198b224.slice/crio-b23b5dae7ddb77a1a149d83048e0cfec022d92d9b484987d3ab7f09e2305eec8 WatchSource:0}: Error finding container b23b5dae7ddb77a1a149d83048e0cfec022d92d9b484987d3ab7f09e2305eec8: Status 404 returned error can't find the container with id b23b5dae7ddb77a1a149d83048e0cfec022d92d9b484987d3ab7f09e2305eec8 Feb 02 10:46:17 crc kubenswrapper[4845]: I0202 10:46:17.840874 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" event={"ID":"f49a4fe2-aa60-4d14-a9bb-f13d0066a542","Type":"ContainerStarted","Data":"c11f12d1825f59697a4f6dadda31bbeed2e0a28a3b18e733bc9c274834f1aff5"} Feb 02 10:46:17 crc kubenswrapper[4845]: I0202 10:46:17.842416 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" event={"ID":"ed30c5ac-3449-4902-b948-34958198b224","Type":"ContainerStarted","Data":"b23b5dae7ddb77a1a149d83048e0cfec022d92d9b484987d3ab7f09e2305eec8"} Feb 02 10:46:17 crc kubenswrapper[4845]: I0202 10:46:17.844358 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58d87f97d7-w9v5x" event={"ID":"c26d4007-db0b-4379-8431-d6e43dec7e9f","Type":"ContainerStarted","Data":"111832ec0e0c78c364956282064b05b1c4c2ce296de61e9f1fa4fa6702a3f91d"} Feb 02 10:46:17 crc kubenswrapper[4845]: I0202 10:46:17.844386 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58d87f97d7-w9v5x" event={"ID":"c26d4007-db0b-4379-8431-d6e43dec7e9f","Type":"ContainerStarted","Data":"b03473270e278212cfa587bd83c780a50bc52c823bf36377b8f5efc441c8224f"} Feb 02 10:46:17 crc kubenswrapper[4845]: I0202 10:46:17.863421 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58d87f97d7-w9v5x" podStartSLOduration=1.863404506 podStartE2EDuration="1.863404506s" podCreationTimestamp="2026-02-02 10:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:46:17.86178748 +0000 UTC m=+858.953188930" watchObservedRunningTime="2026-02-02 10:46:17.863404506 +0000 UTC m=+858.954805976" Feb 02 10:46:20 crc kubenswrapper[4845]: I0202 10:46:20.890868 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-ks7bq" event={"ID":"8c3ff69a-c422-491b-a933-0522f29d7e7c","Type":"ContainerStarted","Data":"8bd793cd3386f047b04124c1ccf954ba7cd2b919a283ffdd15eb8d80fea4832c"} Feb 02 10:46:20 crc kubenswrapper[4845]: I0202 10:46:20.891467 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:20 crc kubenswrapper[4845]: I0202 10:46:20.895249 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" event={"ID":"f49a4fe2-aa60-4d14-a9bb-f13d0066a542","Type":"ContainerStarted","Data":"85113e54fd98a3708a9495d9caed953134b5b79bcc2ad0e13f44fc9ac1034690"} Feb 02 10:46:20 crc 
kubenswrapper[4845]: I0202 10:46:20.897249 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" event={"ID":"ed30c5ac-3449-4902-b948-34958198b224","Type":"ContainerStarted","Data":"9d6f684a704674dd343455675354d91ac207516624064051e805d36c4322c9bd"} Feb 02 10:46:20 crc kubenswrapper[4845]: I0202 10:46:20.898146 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:20 crc kubenswrapper[4845]: I0202 10:46:20.899840 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" event={"ID":"65b8d7a7-4de6-4edc-b652-999572c3494a","Type":"ContainerStarted","Data":"e363f73f23b63099effb3423c08e9d4d424b14b7b8aef0396bc2fe795323940b"} Feb 02 10:46:20 crc kubenswrapper[4845]: I0202 10:46:20.917591 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-ks7bq" podStartSLOduration=2.476387979 podStartE2EDuration="5.917568576s" podCreationTimestamp="2026-02-02 10:46:15 +0000 UTC" firstStartedPulling="2026-02-02 10:46:16.342376106 +0000 UTC m=+857.433777556" lastFinishedPulling="2026-02-02 10:46:19.783556703 +0000 UTC m=+860.874958153" observedRunningTime="2026-02-02 10:46:20.910230984 +0000 UTC m=+862.001632444" watchObservedRunningTime="2026-02-02 10:46:20.917568576 +0000 UTC m=+862.008970026" Feb 02 10:46:20 crc kubenswrapper[4845]: I0202 10:46:20.944239 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" podStartSLOduration=3.31951046 podStartE2EDuration="5.944204535s" podCreationTimestamp="2026-02-02 10:46:15 +0000 UTC" firstStartedPulling="2026-02-02 10:46:17.290085158 +0000 UTC m=+858.381486608" lastFinishedPulling="2026-02-02 10:46:19.914779223 +0000 UTC m=+861.006180683" observedRunningTime="2026-02-02 10:46:20.931381975 +0000 UTC m=+862.022783465" 
watchObservedRunningTime="2026-02-02 10:46:20.944204535 +0000 UTC m=+862.035606025" Feb 02 10:46:20 crc kubenswrapper[4845]: I0202 10:46:20.955003 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-2phr4" podStartSLOduration=1.885785652 podStartE2EDuration="4.954977116s" podCreationTimestamp="2026-02-02 10:46:16 +0000 UTC" firstStartedPulling="2026-02-02 10:46:16.833630594 +0000 UTC m=+857.925032044" lastFinishedPulling="2026-02-02 10:46:19.902822058 +0000 UTC m=+860.994223508" observedRunningTime="2026-02-02 10:46:20.948221961 +0000 UTC m=+862.039623411" watchObservedRunningTime="2026-02-02 10:46:20.954977116 +0000 UTC m=+862.046378576" Feb 02 10:46:23 crc kubenswrapper[4845]: I0202 10:46:23.934553 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" event={"ID":"65b8d7a7-4de6-4edc-b652-999572c3494a","Type":"ContainerStarted","Data":"950542cdbf87ca57f2e0292c5ef95b0a5cd16010d2698dda2673389b91c50ca7"} Feb 02 10:46:23 crc kubenswrapper[4845]: I0202 10:46:23.967033 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-ksh5c" podStartSLOduration=2.243106171 podStartE2EDuration="8.967002999s" podCreationTimestamp="2026-02-02 10:46:15 +0000 UTC" firstStartedPulling="2026-02-02 10:46:16.731760402 +0000 UTC m=+857.823161852" lastFinishedPulling="2026-02-02 10:46:23.45565723 +0000 UTC m=+864.547058680" observedRunningTime="2026-02-02 10:46:23.9621884 +0000 UTC m=+865.053589860" watchObservedRunningTime="2026-02-02 10:46:23.967002999 +0000 UTC m=+865.058404459" Feb 02 10:46:24 crc kubenswrapper[4845]: I0202 10:46:24.090657 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:24 crc kubenswrapper[4845]: I0202 10:46:24.156573 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:24 crc kubenswrapper[4845]: I0202 10:46:24.331471 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k47lx"] Feb 02 10:46:25 crc kubenswrapper[4845]: I0202 10:46:25.949780 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k47lx" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="registry-server" containerID="cri-o://addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c" gracePeriod=2 Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.277805 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-ks7bq" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.401629 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.497278 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxhbf\" (UniqueName: \"kubernetes.io/projected/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-kube-api-access-jxhbf\") pod \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.497677 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-utilities\") pod \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.497704 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-catalog-content\") pod 
\"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\" (UID: \"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b\") " Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.498505 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-utilities" (OuterVolumeSpecName: "utilities") pod "a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" (UID: "a4a4a654-e5c6-4af9-905f-9d4e5b9d032b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.502771 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-kube-api-access-jxhbf" (OuterVolumeSpecName: "kube-api-access-jxhbf") pod "a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" (UID: "a4a4a654-e5c6-4af9-905f-9d4e5b9d032b"). InnerVolumeSpecName "kube-api-access-jxhbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.599461 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxhbf\" (UniqueName: \"kubernetes.io/projected/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-kube-api-access-jxhbf\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.599491 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.614563 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" (UID: "a4a4a654-e5c6-4af9-905f-9d4e5b9d032b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.636021 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.636593 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.641354 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.700952 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.961023 4845 generic.go:334] "Generic (PLEG): container finished" podID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerID="addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c" exitCode=0 Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.961960 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k47lx" event={"ID":"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b","Type":"ContainerDied","Data":"addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c"} Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.962002 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k47lx" event={"ID":"a4a4a654-e5c6-4af9-905f-9d4e5b9d032b","Type":"ContainerDied","Data":"f64227e8467858e78a947aa30dbb8e5c523b4bf1e03d44b55588307424690239"} Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.962033 4845 scope.go:117] "RemoveContainer" containerID="addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 
10:46:26.962471 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k47lx" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.965595 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:46:26 crc kubenswrapper[4845]: I0202 10:46:26.979537 4845 scope.go:117] "RemoveContainer" containerID="608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616" Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.008168 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k47lx"] Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.015534 4845 scope.go:117] "RemoveContainer" containerID="376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f" Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.015707 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k47lx"] Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.024821 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dd6cc54dd-nz852"] Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.053212 4845 scope.go:117] "RemoveContainer" containerID="addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c" Feb 02 10:46:27 crc kubenswrapper[4845]: E0202 10:46:27.058226 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c\": container with ID starting with addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c not found: ID does not exist" containerID="addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c" Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.058267 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c"} err="failed to get container status \"addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c\": rpc error: code = NotFound desc = could not find container \"addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c\": container with ID starting with addd9f19be4b234210edb94ea30984f742598880804586a6dc7f1aa2a0e91c0c not found: ID does not exist" Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.058290 4845 scope.go:117] "RemoveContainer" containerID="608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616" Feb 02 10:46:27 crc kubenswrapper[4845]: E0202 10:46:27.074073 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616\": container with ID starting with 608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616 not found: ID does not exist" containerID="608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616" Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.074135 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616"} err="failed to get container status \"608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616\": rpc error: code = NotFound desc = could not find container \"608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616\": container with ID starting with 608bd3ebd7985eae016e771757b5b2d25d25b990f65c352487ec67efd2034616 not found: ID does not exist" Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.074168 4845 scope.go:117] "RemoveContainer" containerID="376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f" Feb 02 10:46:27 crc kubenswrapper[4845]: E0202 10:46:27.080958 4845 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f\": container with ID starting with 376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f not found: ID does not exist" containerID="376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f" Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.081003 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f"} err="failed to get container status \"376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f\": rpc error: code = NotFound desc = could not find container \"376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f\": container with ID starting with 376394c283294d1180a4be60d934ca55a8664ab5d347e3d5ab14e268ca8f696f not found: ID does not exist" Feb 02 10:46:27 crc kubenswrapper[4845]: I0202 10:46:27.720784 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" path="/var/lib/kubelet/pods/a4a4a654-e5c6-4af9-905f-9d4e5b9d032b/volumes" Feb 02 10:46:36 crc kubenswrapper[4845]: I0202 10:46:36.879807 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-k2dv5" Feb 02 10:46:46 crc kubenswrapper[4845]: I0202 10:46:46.239034 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:46:46 crc kubenswrapper[4845]: I0202 10:46:46.239679 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.078936 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-dd6cc54dd-nz852" podUID="760b8b36-f06d-49ac-9de5-72b222f509d0" containerName="console" containerID="cri-o://57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d" gracePeriod=15 Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.490637 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dd6cc54dd-nz852_760b8b36-f06d-49ac-9de5-72b222f509d0/console/0.log" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.491092 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.563287 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9qjk\" (UniqueName: \"kubernetes.io/projected/760b8b36-f06d-49ac-9de5-72b222f509d0-kube-api-access-f9qjk\") pod \"760b8b36-f06d-49ac-9de5-72b222f509d0\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.563352 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-console-config\") pod \"760b8b36-f06d-49ac-9de5-72b222f509d0\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.563374 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-trusted-ca-bundle\") pod \"760b8b36-f06d-49ac-9de5-72b222f509d0\" (UID: 
\"760b8b36-f06d-49ac-9de5-72b222f509d0\") " Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.563427 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-oauth-serving-cert\") pod \"760b8b36-f06d-49ac-9de5-72b222f509d0\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.564264 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "760b8b36-f06d-49ac-9de5-72b222f509d0" (UID: "760b8b36-f06d-49ac-9de5-72b222f509d0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.564281 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "760b8b36-f06d-49ac-9de5-72b222f509d0" (UID: "760b8b36-f06d-49ac-9de5-72b222f509d0"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.564325 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-serving-cert\") pod \"760b8b36-f06d-49ac-9de5-72b222f509d0\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.564320 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-console-config" (OuterVolumeSpecName: "console-config") pod "760b8b36-f06d-49ac-9de5-72b222f509d0" (UID: "760b8b36-f06d-49ac-9de5-72b222f509d0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.564346 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-service-ca\") pod \"760b8b36-f06d-49ac-9de5-72b222f509d0\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.564463 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-oauth-config\") pod \"760b8b36-f06d-49ac-9de5-72b222f509d0\" (UID: \"760b8b36-f06d-49ac-9de5-72b222f509d0\") " Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.565116 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "760b8b36-f06d-49ac-9de5-72b222f509d0" (UID: "760b8b36-f06d-49ac-9de5-72b222f509d0"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.565269 4845 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.565287 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.565296 4845 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.565305 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/760b8b36-f06d-49ac-9de5-72b222f509d0-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.569440 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "760b8b36-f06d-49ac-9de5-72b222f509d0" (UID: "760b8b36-f06d-49ac-9de5-72b222f509d0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.572553 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760b8b36-f06d-49ac-9de5-72b222f509d0-kube-api-access-f9qjk" (OuterVolumeSpecName: "kube-api-access-f9qjk") pod "760b8b36-f06d-49ac-9de5-72b222f509d0" (UID: "760b8b36-f06d-49ac-9de5-72b222f509d0"). InnerVolumeSpecName "kube-api-access-f9qjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.579185 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "760b8b36-f06d-49ac-9de5-72b222f509d0" (UID: "760b8b36-f06d-49ac-9de5-72b222f509d0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.667234 4845 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.667278 4845 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/760b8b36-f06d-49ac-9de5-72b222f509d0-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:52 crc kubenswrapper[4845]: I0202 10:46:52.667291 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9qjk\" (UniqueName: \"kubernetes.io/projected/760b8b36-f06d-49ac-9de5-72b222f509d0-kube-api-access-f9qjk\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.159027 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dd6cc54dd-nz852_760b8b36-f06d-49ac-9de5-72b222f509d0/console/0.log" Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.160189 4845 generic.go:334] "Generic (PLEG): container finished" podID="760b8b36-f06d-49ac-9de5-72b222f509d0" containerID="57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d" exitCode=2 Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.160272 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd6cc54dd-nz852" 
event={"ID":"760b8b36-f06d-49ac-9de5-72b222f509d0","Type":"ContainerDied","Data":"57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d"} Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.160304 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dd6cc54dd-nz852" event={"ID":"760b8b36-f06d-49ac-9de5-72b222f509d0","Type":"ContainerDied","Data":"3bdbcedf982f353d1f33f9e0800794674194143b9b56bad7bb0604d71624ce6e"} Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.160321 4845 scope.go:117] "RemoveContainer" containerID="57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d" Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.160349 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dd6cc54dd-nz852" Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.190974 4845 scope.go:117] "RemoveContainer" containerID="57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d" Feb 02 10:46:53 crc kubenswrapper[4845]: E0202 10:46:53.191383 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d\": container with ID starting with 57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d not found: ID does not exist" containerID="57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d" Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.191415 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d"} err="failed to get container status \"57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d\": rpc error: code = NotFound desc = could not find container \"57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d\": container with ID starting with 
57cc98cec108b0b86f1d4c2be0f04ee12886cfce29725cbcd66fe5fd1120952d not found: ID does not exist" Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.192624 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dd6cc54dd-nz852"] Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.199618 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dd6cc54dd-nz852"] Feb 02 10:46:53 crc kubenswrapper[4845]: I0202 10:46:53.722663 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760b8b36-f06d-49ac-9de5-72b222f509d0" path="/var/lib/kubelet/pods/760b8b36-f06d-49ac-9de5-72b222f509d0/volumes" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.244610 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr"] Feb 02 10:46:54 crc kubenswrapper[4845]: E0202 10:46:54.244960 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="extract-utilities" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.244976 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="extract-utilities" Feb 02 10:46:54 crc kubenswrapper[4845]: E0202 10:46:54.244987 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760b8b36-f06d-49ac-9de5-72b222f509d0" containerName="console" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.244992 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="760b8b36-f06d-49ac-9de5-72b222f509d0" containerName="console" Feb 02 10:46:54 crc kubenswrapper[4845]: E0202 10:46:54.245017 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="registry-server" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.245025 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="registry-server" Feb 02 10:46:54 crc kubenswrapper[4845]: E0202 10:46:54.245039 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="extract-content" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.245045 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="extract-content" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.245170 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="760b8b36-f06d-49ac-9de5-72b222f509d0" containerName="console" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.245182 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a4a654-e5c6-4af9-905f-9d4e5b9d032b" containerName="registry-server" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.246259 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.248778 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.256199 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr"] Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.289826 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc 
kubenswrapper[4845]: I0202 10:46:54.289902 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7csnr\" (UniqueName: \"kubernetes.io/projected/2d07b157-761c-4649-ace7-6b9e73636713-kube-api-access-7csnr\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.290148 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.391482 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.391658 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.391744 4845 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-7csnr\" (UniqueName: \"kubernetes.io/projected/2d07b157-761c-4649-ace7-6b9e73636713-kube-api-access-7csnr\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.392427 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.392590 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.409325 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7csnr\" (UniqueName: \"kubernetes.io/projected/2d07b157-761c-4649-ace7-6b9e73636713-kube-api-access-7csnr\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:54 crc kubenswrapper[4845]: I0202 10:46:54.599833 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:46:55 crc kubenswrapper[4845]: I0202 10:46:55.041459 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr"] Feb 02 10:46:55 crc kubenswrapper[4845]: I0202 10:46:55.175562 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" event={"ID":"2d07b157-761c-4649-ace7-6b9e73636713","Type":"ContainerStarted","Data":"09c9cd81aed13fcd5104531d64e7e1a40b67eeb136b2d66bb6f9c7e1b59f5c3e"} Feb 02 10:46:55 crc kubenswrapper[4845]: E0202 10:46:55.343385 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d07b157_761c_4649_ace7_6b9e73636713.slice/crio-conmon-b90648732bdfc02c497925d9fdbbc6781a0df319de0ab6066f55a1ef80c84dcb.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:46:56 crc kubenswrapper[4845]: I0202 10:46:56.187117 4845 generic.go:334] "Generic (PLEG): container finished" podID="2d07b157-761c-4649-ace7-6b9e73636713" containerID="b90648732bdfc02c497925d9fdbbc6781a0df319de0ab6066f55a1ef80c84dcb" exitCode=0 Feb 02 10:46:56 crc kubenswrapper[4845]: I0202 10:46:56.187162 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" event={"ID":"2d07b157-761c-4649-ace7-6b9e73636713","Type":"ContainerDied","Data":"b90648732bdfc02c497925d9fdbbc6781a0df319de0ab6066f55a1ef80c84dcb"} Feb 02 10:46:56 crc kubenswrapper[4845]: I0202 10:46:56.189201 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:46:58 crc kubenswrapper[4845]: I0202 10:46:58.201727 4845 generic.go:334] "Generic (PLEG): container 
finished" podID="2d07b157-761c-4649-ace7-6b9e73636713" containerID="2b1d98ea217d455458a46c7c0d2afb20f91215b0d3f661d57ccfe2e6f26e5b60" exitCode=0 Feb 02 10:46:58 crc kubenswrapper[4845]: I0202 10:46:58.201819 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" event={"ID":"2d07b157-761c-4649-ace7-6b9e73636713","Type":"ContainerDied","Data":"2b1d98ea217d455458a46c7c0d2afb20f91215b0d3f661d57ccfe2e6f26e5b60"} Feb 02 10:46:59 crc kubenswrapper[4845]: I0202 10:46:59.212712 4845 generic.go:334] "Generic (PLEG): container finished" podID="2d07b157-761c-4649-ace7-6b9e73636713" containerID="b198f234f40d1c00f2b84d737f03fa568d6dd9dd444a43d7dbae3d8ce0b39534" exitCode=0 Feb 02 10:46:59 crc kubenswrapper[4845]: I0202 10:46:59.212756 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" event={"ID":"2d07b157-761c-4649-ace7-6b9e73636713","Type":"ContainerDied","Data":"b198f234f40d1c00f2b84d737f03fa568d6dd9dd444a43d7dbae3d8ce0b39534"} Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.572611 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.599021 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-util\") pod \"2d07b157-761c-4649-ace7-6b9e73636713\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.599252 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-bundle\") pod \"2d07b157-761c-4649-ace7-6b9e73636713\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.599349 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7csnr\" (UniqueName: \"kubernetes.io/projected/2d07b157-761c-4649-ace7-6b9e73636713-kube-api-access-7csnr\") pod \"2d07b157-761c-4649-ace7-6b9e73636713\" (UID: \"2d07b157-761c-4649-ace7-6b9e73636713\") " Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.602298 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-bundle" (OuterVolumeSpecName: "bundle") pod "2d07b157-761c-4649-ace7-6b9e73636713" (UID: "2d07b157-761c-4649-ace7-6b9e73636713"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.613211 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d07b157-761c-4649-ace7-6b9e73636713-kube-api-access-7csnr" (OuterVolumeSpecName: "kube-api-access-7csnr") pod "2d07b157-761c-4649-ace7-6b9e73636713" (UID: "2d07b157-761c-4649-ace7-6b9e73636713"). InnerVolumeSpecName "kube-api-access-7csnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.703253 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7csnr\" (UniqueName: \"kubernetes.io/projected/2d07b157-761c-4649-ace7-6b9e73636713-kube-api-access-7csnr\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.703298 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:00 crc kubenswrapper[4845]: I0202 10:47:00.930968 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-util" (OuterVolumeSpecName: "util") pod "2d07b157-761c-4649-ace7-6b9e73636713" (UID: "2d07b157-761c-4649-ace7-6b9e73636713"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:01 crc kubenswrapper[4845]: I0202 10:47:01.007129 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d07b157-761c-4649-ace7-6b9e73636713-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:01 crc kubenswrapper[4845]: I0202 10:47:01.229148 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" event={"ID":"2d07b157-761c-4649-ace7-6b9e73636713","Type":"ContainerDied","Data":"09c9cd81aed13fcd5104531d64e7e1a40b67eeb136b2d66bb6f9c7e1b59f5c3e"} Feb 02 10:47:01 crc kubenswrapper[4845]: I0202 10:47:01.229195 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c9cd81aed13fcd5104531d64e7e1a40b67eeb136b2d66bb6f9c7e1b59f5c3e" Feb 02 10:47:01 crc kubenswrapper[4845]: I0202 10:47:01.229195 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.821780 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n9cnc"] Feb 02 10:47:03 crc kubenswrapper[4845]: E0202 10:47:03.822372 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d07b157-761c-4649-ace7-6b9e73636713" containerName="pull" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.822386 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d07b157-761c-4649-ace7-6b9e73636713" containerName="pull" Feb 02 10:47:03 crc kubenswrapper[4845]: E0202 10:47:03.822405 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d07b157-761c-4649-ace7-6b9e73636713" containerName="util" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.822413 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d07b157-761c-4649-ace7-6b9e73636713" containerName="util" Feb 02 10:47:03 crc kubenswrapper[4845]: E0202 10:47:03.822423 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d07b157-761c-4649-ace7-6b9e73636713" containerName="extract" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.822431 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d07b157-761c-4649-ace7-6b9e73636713" containerName="extract" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.822586 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d07b157-761c-4649-ace7-6b9e73636713" containerName="extract" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.825143 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.846409 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n9cnc"] Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.948614 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhm8\" (UniqueName: \"kubernetes.io/projected/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-kube-api-access-mmhm8\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.948724 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-catalog-content\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:03 crc kubenswrapper[4845]: I0202 10:47:03.948755 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-utilities\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.050000 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhm8\" (UniqueName: \"kubernetes.io/projected/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-kube-api-access-mmhm8\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.050394 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-catalog-content\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.050423 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-utilities\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.050849 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-catalog-content\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.050978 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-utilities\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.073497 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhm8\" (UniqueName: \"kubernetes.io/projected/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-kube-api-access-mmhm8\") pod \"community-operators-n9cnc\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.142627 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.224946 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fg84s"] Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.226517 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.232483 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fg84s"] Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.252996 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-catalog-content\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.253066 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-utilities\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.253127 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qqjf\" (UniqueName: \"kubernetes.io/projected/45a6b980-b780-4ec9-a2d3-4684981d8d4e-kube-api-access-7qqjf\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.359280 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7qqjf\" (UniqueName: \"kubernetes.io/projected/45a6b980-b780-4ec9-a2d3-4684981d8d4e-kube-api-access-7qqjf\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.359653 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-catalog-content\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.359794 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-utilities\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.360486 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-utilities\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.361693 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-catalog-content\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.408456 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qqjf\" (UniqueName: 
\"kubernetes.io/projected/45a6b980-b780-4ec9-a2d3-4684981d8d4e-kube-api-access-7qqjf\") pod \"certified-operators-fg84s\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.576599 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:04 crc kubenswrapper[4845]: I0202 10:47:04.697368 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n9cnc"] Feb 02 10:47:04 crc kubenswrapper[4845]: W0202 10:47:04.707634 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61fbccca_6f3d_48c6_b052_63b8f73bb8fd.slice/crio-1da0eaa65c059fb15be33cf9aefc7849e6710f13d12947ce4734cbfccbc41cd1 WatchSource:0}: Error finding container 1da0eaa65c059fb15be33cf9aefc7849e6710f13d12947ce4734cbfccbc41cd1: Status 404 returned error can't find the container with id 1da0eaa65c059fb15be33cf9aefc7849e6710f13d12947ce4734cbfccbc41cd1 Feb 02 10:47:05 crc kubenswrapper[4845]: I0202 10:47:05.121121 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fg84s"] Feb 02 10:47:05 crc kubenswrapper[4845]: W0202 10:47:05.126305 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a6b980_b780_4ec9_a2d3_4684981d8d4e.slice/crio-3408ca9957d7cb1deec24b0d6355736505a717add3896628e6c46db5dab8bf77 WatchSource:0}: Error finding container 3408ca9957d7cb1deec24b0d6355736505a717add3896628e6c46db5dab8bf77: Status 404 returned error can't find the container with id 3408ca9957d7cb1deec24b0d6355736505a717add3896628e6c46db5dab8bf77 Feb 02 10:47:05 crc kubenswrapper[4845]: I0202 10:47:05.277183 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-fg84s" event={"ID":"45a6b980-b780-4ec9-a2d3-4684981d8d4e","Type":"ContainerStarted","Data":"3408ca9957d7cb1deec24b0d6355736505a717add3896628e6c46db5dab8bf77"} Feb 02 10:47:05 crc kubenswrapper[4845]: I0202 10:47:05.279040 4845 generic.go:334] "Generic (PLEG): container finished" podID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerID="4995da76a047f6c49948e06b6d40f72683aaa6ed41b78182d2a96f15171c01f1" exitCode=0 Feb 02 10:47:05 crc kubenswrapper[4845]: I0202 10:47:05.279094 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9cnc" event={"ID":"61fbccca-6f3d-48c6-b052-63b8f73bb8fd","Type":"ContainerDied","Data":"4995da76a047f6c49948e06b6d40f72683aaa6ed41b78182d2a96f15171c01f1"} Feb 02 10:47:05 crc kubenswrapper[4845]: I0202 10:47:05.279128 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9cnc" event={"ID":"61fbccca-6f3d-48c6-b052-63b8f73bb8fd","Type":"ContainerStarted","Data":"1da0eaa65c059fb15be33cf9aefc7849e6710f13d12947ce4734cbfccbc41cd1"} Feb 02 10:47:06 crc kubenswrapper[4845]: I0202 10:47:06.288919 4845 generic.go:334] "Generic (PLEG): container finished" podID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerID="df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63" exitCode=0 Feb 02 10:47:06 crc kubenswrapper[4845]: I0202 10:47:06.289020 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fg84s" event={"ID":"45a6b980-b780-4ec9-a2d3-4684981d8d4e","Type":"ContainerDied","Data":"df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63"} Feb 02 10:47:06 crc kubenswrapper[4845]: I0202 10:47:06.293989 4845 generic.go:334] "Generic (PLEG): container finished" podID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerID="5e9b128387b5628d756bc68b8474bbcdad91b25a1b63ae10551ab7d080e9632b" exitCode=0 Feb 02 10:47:06 crc kubenswrapper[4845]: I0202 
10:47:06.294050 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9cnc" event={"ID":"61fbccca-6f3d-48c6-b052-63b8f73bb8fd","Type":"ContainerDied","Data":"5e9b128387b5628d756bc68b8474bbcdad91b25a1b63ae10551ab7d080e9632b"} Feb 02 10:47:07 crc kubenswrapper[4845]: I0202 10:47:07.303689 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fg84s" event={"ID":"45a6b980-b780-4ec9-a2d3-4684981d8d4e","Type":"ContainerStarted","Data":"b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7"} Feb 02 10:47:07 crc kubenswrapper[4845]: I0202 10:47:07.305910 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9cnc" event={"ID":"61fbccca-6f3d-48c6-b052-63b8f73bb8fd","Type":"ContainerStarted","Data":"328f5b00453f26dbc88c2ec506227c6487985622610eca8cadc30738653842c0"} Feb 02 10:47:07 crc kubenswrapper[4845]: I0202 10:47:07.346262 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n9cnc" podStartSLOduration=2.913642594 podStartE2EDuration="4.346241911s" podCreationTimestamp="2026-02-02 10:47:03 +0000 UTC" firstStartedPulling="2026-02-02 10:47:05.280649052 +0000 UTC m=+906.372050502" lastFinishedPulling="2026-02-02 10:47:06.713248369 +0000 UTC m=+907.804649819" observedRunningTime="2026-02-02 10:47:07.33930265 +0000 UTC m=+908.430704120" watchObservedRunningTime="2026-02-02 10:47:07.346241911 +0000 UTC m=+908.437643361" Feb 02 10:47:08 crc kubenswrapper[4845]: I0202 10:47:08.314504 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fg84s" event={"ID":"45a6b980-b780-4ec9-a2d3-4684981d8d4e","Type":"ContainerDied","Data":"b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7"} Feb 02 10:47:08 crc kubenswrapper[4845]: I0202 10:47:08.314394 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerID="b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7" exitCode=0 Feb 02 10:47:09 crc kubenswrapper[4845]: I0202 10:47:09.324295 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fg84s" event={"ID":"45a6b980-b780-4ec9-a2d3-4684981d8d4e","Type":"ContainerStarted","Data":"fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba"} Feb 02 10:47:09 crc kubenswrapper[4845]: I0202 10:47:09.345980 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fg84s" podStartSLOduration=2.923789457 podStartE2EDuration="5.345961163s" podCreationTimestamp="2026-02-02 10:47:04 +0000 UTC" firstStartedPulling="2026-02-02 10:47:06.29434227 +0000 UTC m=+907.385743720" lastFinishedPulling="2026-02-02 10:47:08.716513976 +0000 UTC m=+909.807915426" observedRunningTime="2026-02-02 10:47:09.342296538 +0000 UTC m=+910.433697988" watchObservedRunningTime="2026-02-02 10:47:09.345961163 +0000 UTC m=+910.437362613" Feb 02 10:47:10 crc kubenswrapper[4845]: I0202 10:47:10.811922 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b5lf6"] Feb 02 10:47:10 crc kubenswrapper[4845]: I0202 10:47:10.813686 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:10 crc kubenswrapper[4845]: I0202 10:47:10.824484 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5lf6"] Feb 02 10:47:10 crc kubenswrapper[4845]: I0202 10:47:10.971918 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-catalog-content\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:10 crc kubenswrapper[4845]: I0202 10:47:10.972005 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-utilities\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:10 crc kubenswrapper[4845]: I0202 10:47:10.972036 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqs2v\" (UniqueName: \"kubernetes.io/projected/63bd6954-f4d1-44ff-9b92-074c115afffc-kube-api-access-fqs2v\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:11 crc kubenswrapper[4845]: I0202 10:47:11.073988 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-catalog-content\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:11 crc kubenswrapper[4845]: I0202 10:47:11.074032 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-utilities\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:11 crc kubenswrapper[4845]: I0202 10:47:11.074051 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqs2v\" (UniqueName: \"kubernetes.io/projected/63bd6954-f4d1-44ff-9b92-074c115afffc-kube-api-access-fqs2v\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:11 crc kubenswrapper[4845]: I0202 10:47:11.074493 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-catalog-content\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:11 crc kubenswrapper[4845]: I0202 10:47:11.074616 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-utilities\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:11 crc kubenswrapper[4845]: I0202 10:47:11.096464 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqs2v\" (UniqueName: \"kubernetes.io/projected/63bd6954-f4d1-44ff-9b92-074c115afffc-kube-api-access-fqs2v\") pod \"redhat-marketplace-b5lf6\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:11 crc kubenswrapper[4845]: I0202 10:47:11.175820 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:11 crc kubenswrapper[4845]: I0202 10:47:11.523160 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5lf6"] Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.349696 4845 generic.go:334] "Generic (PLEG): container finished" podID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerID="3fd971db0c565ac1d8cf02bc0f6981f66795692b2f573fc704b7bb4ba2cf3eef" exitCode=0 Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.349905 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5lf6" event={"ID":"63bd6954-f4d1-44ff-9b92-074c115afffc","Type":"ContainerDied","Data":"3fd971db0c565ac1d8cf02bc0f6981f66795692b2f573fc704b7bb4ba2cf3eef"} Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.350018 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5lf6" event={"ID":"63bd6954-f4d1-44ff-9b92-074c115afffc","Type":"ContainerStarted","Data":"050f1387b1bafaa0eb6a0d18bdb89af0db564a1754a0812c0ed9d00b6642e72b"} Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.680130 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-bcff8566-gkqml"] Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.681453 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.689291 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.689511 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.689718 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.689849 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.700277 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-s8tzj" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.767121 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bcff8566-gkqml"] Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.799195 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/71926ac8-4fc3-41de-8295-01c8ddbb9d27-webhook-cert\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: \"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.799301 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/71926ac8-4fc3-41de-8295-01c8ddbb9d27-apiservice-cert\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: 
\"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.799323 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgfxr\" (UniqueName: \"kubernetes.io/projected/71926ac8-4fc3-41de-8295-01c8ddbb9d27-kube-api-access-pgfxr\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: \"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.900706 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/71926ac8-4fc3-41de-8295-01c8ddbb9d27-apiservice-cert\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: \"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.900741 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgfxr\" (UniqueName: \"kubernetes.io/projected/71926ac8-4fc3-41de-8295-01c8ddbb9d27-kube-api-access-pgfxr\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: \"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.900819 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/71926ac8-4fc3-41de-8295-01c8ddbb9d27-webhook-cert\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: \"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.906346 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/71926ac8-4fc3-41de-8295-01c8ddbb9d27-webhook-cert\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: \"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.906872 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/71926ac8-4fc3-41de-8295-01c8ddbb9d27-apiservice-cert\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: \"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.919567 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgfxr\" (UniqueName: \"kubernetes.io/projected/71926ac8-4fc3-41de-8295-01c8ddbb9d27-kube-api-access-pgfxr\") pod \"metallb-operator-controller-manager-bcff8566-gkqml\" (UID: \"71926ac8-4fc3-41de-8295-01c8ddbb9d27\") " pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.951583 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn"] Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.953870 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.956869 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.957967 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.961208 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tc85c" Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.970364 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn"] Feb 02 10:47:12 crc kubenswrapper[4845]: I0202 10:47:12.999556 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.103935 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgqcf\" (UniqueName: \"kubernetes.io/projected/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-kube-api-access-jgqcf\") pod \"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.104025 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-webhook-cert\") pod \"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 
10:47:13.104065 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-apiservice-cert\") pod \"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.206855 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgqcf\" (UniqueName: \"kubernetes.io/projected/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-kube-api-access-jgqcf\") pod \"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.207001 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-webhook-cert\") pod \"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.207060 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-apiservice-cert\") pod \"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.216135 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-apiservice-cert\") pod 
\"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.216216 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-webhook-cert\") pod \"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.225584 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgqcf\" (UniqueName: \"kubernetes.io/projected/d2f82fb6-ff9c-4578-8e8c-2bc454b09927-kube-api-access-jgqcf\") pod \"metallb-operator-webhook-server-66c6bb874c-q55bn\" (UID: \"d2f82fb6-ff9c-4578-8e8c-2bc454b09927\") " pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.308350 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.380612 4845 generic.go:334] "Generic (PLEG): container finished" podID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerID="25c9408a498bdfbac908ba7a192f1dcdf10cbbe5bf9963596e569aea99ba57cb" exitCode=0 Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.380653 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5lf6" event={"ID":"63bd6954-f4d1-44ff-9b92-074c115afffc","Type":"ContainerDied","Data":"25c9408a498bdfbac908ba7a192f1dcdf10cbbe5bf9963596e569aea99ba57cb"} Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.587191 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-bcff8566-gkqml"] Feb 02 10:47:13 crc kubenswrapper[4845]: W0202 10:47:13.601003 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71926ac8_4fc3_41de_8295_01c8ddbb9d27.slice/crio-2ec9f47aecbf4d2fd313bd04c10fcc7b70a3bc6ba5a5d33ac1f39ec856ef3bf5 WatchSource:0}: Error finding container 2ec9f47aecbf4d2fd313bd04c10fcc7b70a3bc6ba5a5d33ac1f39ec856ef3bf5: Status 404 returned error can't find the container with id 2ec9f47aecbf4d2fd313bd04c10fcc7b70a3bc6ba5a5d33ac1f39ec856ef3bf5 Feb 02 10:47:13 crc kubenswrapper[4845]: I0202 10:47:13.861654 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn"] Feb 02 10:47:13 crc kubenswrapper[4845]: W0202 10:47:13.867730 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2f82fb6_ff9c_4578_8e8c_2bc454b09927.slice/crio-cbbefb99d3c80ef89be4f60ac093af5fdce6e0a78fc0d13e2eaaff7de027a1ae WatchSource:0}: Error finding container 
cbbefb99d3c80ef89be4f60ac093af5fdce6e0a78fc0d13e2eaaff7de027a1ae: Status 404 returned error can't find the container with id cbbefb99d3c80ef89be4f60ac093af5fdce6e0a78fc0d13e2eaaff7de027a1ae Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.143465 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.143572 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.185733 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.389809 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" event={"ID":"d2f82fb6-ff9c-4578-8e8c-2bc454b09927","Type":"ContainerStarted","Data":"cbbefb99d3c80ef89be4f60ac093af5fdce6e0a78fc0d13e2eaaff7de027a1ae"} Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.390945 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" event={"ID":"71926ac8-4fc3-41de-8295-01c8ddbb9d27","Type":"ContainerStarted","Data":"2ec9f47aecbf4d2fd313bd04c10fcc7b70a3bc6ba5a5d33ac1f39ec856ef3bf5"} Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.393166 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5lf6" event={"ID":"63bd6954-f4d1-44ff-9b92-074c115afffc","Type":"ContainerStarted","Data":"5fb31abd9b4de4f2255ab9a57011ca0b111c55d34d92ac3f568d3966ce45afe3"} Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.443836 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:14 crc kubenswrapper[4845]: 
I0202 10:47:14.469003 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b5lf6" podStartSLOduration=3.014943005 podStartE2EDuration="4.46898657s" podCreationTimestamp="2026-02-02 10:47:10 +0000 UTC" firstStartedPulling="2026-02-02 10:47:12.353091186 +0000 UTC m=+913.444492636" lastFinishedPulling="2026-02-02 10:47:13.807134751 +0000 UTC m=+914.898536201" observedRunningTime="2026-02-02 10:47:14.421959424 +0000 UTC m=+915.513360884" watchObservedRunningTime="2026-02-02 10:47:14.46898657 +0000 UTC m=+915.560388020" Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.577547 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.577613 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:14 crc kubenswrapper[4845]: I0202 10:47:14.622340 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:15 crc kubenswrapper[4845]: I0202 10:47:15.495273 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:16 crc kubenswrapper[4845]: I0202 10:47:16.237460 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:47:16 crc kubenswrapper[4845]: I0202 10:47:16.237541 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:47:17 crc kubenswrapper[4845]: I0202 10:47:17.419250 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" event={"ID":"71926ac8-4fc3-41de-8295-01c8ddbb9d27","Type":"ContainerStarted","Data":"f949d5829fbc5e7934f72b74feac3639d58843e290be8dbeeb04e2733d0cea01"} Feb 02 10:47:17 crc kubenswrapper[4845]: I0202 10:47:17.419746 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:18 crc kubenswrapper[4845]: I0202 10:47:18.000326 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" podStartSLOduration=2.708828863 podStartE2EDuration="6.000307793s" podCreationTimestamp="2026-02-02 10:47:12 +0000 UTC" firstStartedPulling="2026-02-02 10:47:13.609119093 +0000 UTC m=+914.700520543" lastFinishedPulling="2026-02-02 10:47:16.900598023 +0000 UTC m=+917.991999473" observedRunningTime="2026-02-02 10:47:17.442383941 +0000 UTC m=+918.533785401" watchObservedRunningTime="2026-02-02 10:47:18.000307793 +0000 UTC m=+919.091709243" Feb 02 10:47:18 crc kubenswrapper[4845]: I0202 10:47:18.016432 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n9cnc"] Feb 02 10:47:18 crc kubenswrapper[4845]: I0202 10:47:18.016670 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n9cnc" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerName="registry-server" containerID="cri-o://328f5b00453f26dbc88c2ec506227c6487985622610eca8cadc30738653842c0" gracePeriod=2 Feb 02 10:47:18 crc kubenswrapper[4845]: I0202 10:47:18.438611 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerID="328f5b00453f26dbc88c2ec506227c6487985622610eca8cadc30738653842c0" exitCode=0 Feb 02 10:47:18 crc kubenswrapper[4845]: I0202 10:47:18.438693 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9cnc" event={"ID":"61fbccca-6f3d-48c6-b052-63b8f73bb8fd","Type":"ContainerDied","Data":"328f5b00453f26dbc88c2ec506227c6487985622610eca8cadc30738653842c0"} Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.153786 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.308224 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmhm8\" (UniqueName: \"kubernetes.io/projected/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-kube-api-access-mmhm8\") pod \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.308396 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-utilities\") pod \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.308416 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-catalog-content\") pod \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\" (UID: \"61fbccca-6f3d-48c6-b052-63b8f73bb8fd\") " Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.309367 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-utilities" (OuterVolumeSpecName: "utilities") pod 
"61fbccca-6f3d-48c6-b052-63b8f73bb8fd" (UID: "61fbccca-6f3d-48c6-b052-63b8f73bb8fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.313657 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-kube-api-access-mmhm8" (OuterVolumeSpecName: "kube-api-access-mmhm8") pod "61fbccca-6f3d-48c6-b052-63b8f73bb8fd" (UID: "61fbccca-6f3d-48c6-b052-63b8f73bb8fd"). InnerVolumeSpecName "kube-api-access-mmhm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.358390 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61fbccca-6f3d-48c6-b052-63b8f73bb8fd" (UID: "61fbccca-6f3d-48c6-b052-63b8f73bb8fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.406267 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fg84s"] Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.406540 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fg84s" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerName="registry-server" containerID="cri-o://fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba" gracePeriod=2 Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.413083 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.413123 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.413139 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmhm8\" (UniqueName: \"kubernetes.io/projected/61fbccca-6f3d-48c6-b052-63b8f73bb8fd-kube-api-access-mmhm8\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.448843 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n9cnc" event={"ID":"61fbccca-6f3d-48c6-b052-63b8f73bb8fd","Type":"ContainerDied","Data":"1da0eaa65c059fb15be33cf9aefc7849e6710f13d12947ce4734cbfccbc41cd1"} Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.449905 4845 scope.go:117] "RemoveContainer" containerID="328f5b00453f26dbc88c2ec506227c6487985622610eca8cadc30738653842c0" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.449086 4845 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n9cnc" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.467864 4845 scope.go:117] "RemoveContainer" containerID="5e9b128387b5628d756bc68b8474bbcdad91b25a1b63ae10551ab7d080e9632b" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.482228 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n9cnc"] Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.487243 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n9cnc"] Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.498188 4845 scope.go:117] "RemoveContainer" containerID="4995da76a047f6c49948e06b6d40f72683aaa6ed41b78182d2a96f15171c01f1" Feb 02 10:47:19 crc kubenswrapper[4845]: I0202 10:47:19.727513 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" path="/var/lib/kubelet/pods/61fbccca-6f3d-48c6-b052-63b8f73bb8fd/volumes" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.379767 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.463176 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" event={"ID":"d2f82fb6-ff9c-4578-8e8c-2bc454b09927","Type":"ContainerStarted","Data":"6042840959df0f8bfe6fec30c05a75456fc9bc882d4b4b13985926fc73ee637c"} Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.464318 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.466971 4845 generic.go:334] "Generic (PLEG): container finished" podID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerID="fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba" exitCode=0 Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.467014 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fg84s" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.467013 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fg84s" event={"ID":"45a6b980-b780-4ec9-a2d3-4684981d8d4e","Type":"ContainerDied","Data":"fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba"} Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.467135 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fg84s" event={"ID":"45a6b980-b780-4ec9-a2d3-4684981d8d4e","Type":"ContainerDied","Data":"3408ca9957d7cb1deec24b0d6355736505a717add3896628e6c46db5dab8bf77"} Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.467155 4845 scope.go:117] "RemoveContainer" containerID="fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.495483 4845 scope.go:117] "RemoveContainer" 
containerID="b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.508935 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" podStartSLOduration=3.405865535 podStartE2EDuration="8.508916466s" podCreationTimestamp="2026-02-02 10:47:12 +0000 UTC" firstStartedPulling="2026-02-02 10:47:13.870817127 +0000 UTC m=+914.962218577" lastFinishedPulling="2026-02-02 10:47:18.973868048 +0000 UTC m=+920.065269508" observedRunningTime="2026-02-02 10:47:20.507279349 +0000 UTC m=+921.598680809" watchObservedRunningTime="2026-02-02 10:47:20.508916466 +0000 UTC m=+921.600317916" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.525995 4845 scope.go:117] "RemoveContainer" containerID="df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.538647 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qqjf\" (UniqueName: \"kubernetes.io/projected/45a6b980-b780-4ec9-a2d3-4684981d8d4e-kube-api-access-7qqjf\") pod \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.538724 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-utilities\") pod \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.538859 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-catalog-content\") pod \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\" (UID: \"45a6b980-b780-4ec9-a2d3-4684981d8d4e\") " Feb 02 10:47:20 crc 
kubenswrapper[4845]: I0202 10:47:20.539585 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-utilities" (OuterVolumeSpecName: "utilities") pod "45a6b980-b780-4ec9-a2d3-4684981d8d4e" (UID: "45a6b980-b780-4ec9-a2d3-4684981d8d4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.539770 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.560039 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a6b980-b780-4ec9-a2d3-4684981d8d4e-kube-api-access-7qqjf" (OuterVolumeSpecName: "kube-api-access-7qqjf") pod "45a6b980-b780-4ec9-a2d3-4684981d8d4e" (UID: "45a6b980-b780-4ec9-a2d3-4684981d8d4e"). InnerVolumeSpecName "kube-api-access-7qqjf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.566211 4845 scope.go:117] "RemoveContainer" containerID="fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba" Feb 02 10:47:20 crc kubenswrapper[4845]: E0202 10:47:20.567383 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba\": container with ID starting with fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba not found: ID does not exist" containerID="fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.567444 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba"} err="failed to get container status \"fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba\": rpc error: code = NotFound desc = could not find container \"fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba\": container with ID starting with fbd230db849d847c74dbb0a41b7f17acfbe8042a249c671b2fd77745fa5f64ba not found: ID does not exist" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.567476 4845 scope.go:117] "RemoveContainer" containerID="b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7" Feb 02 10:47:20 crc kubenswrapper[4845]: E0202 10:47:20.567936 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7\": container with ID starting with b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7 not found: ID does not exist" containerID="b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.567961 
4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7"} err="failed to get container status \"b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7\": rpc error: code = NotFound desc = could not find container \"b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7\": container with ID starting with b0ceb49b8726237f34fa694bf63e6f58856f9d6185fdbb98eeee756180f77bf7 not found: ID does not exist" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.567978 4845 scope.go:117] "RemoveContainer" containerID="df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63" Feb 02 10:47:20 crc kubenswrapper[4845]: E0202 10:47:20.568233 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63\": container with ID starting with df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63 not found: ID does not exist" containerID="df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.568263 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63"} err="failed to get container status \"df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63\": rpc error: code = NotFound desc = could not find container \"df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63\": container with ID starting with df0b246465b706b51c1fcb948f2c4342e4b8b5dcf4ac1ede2bbc2209c23ddd63 not found: ID does not exist" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.610099 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "45a6b980-b780-4ec9-a2d3-4684981d8d4e" (UID: "45a6b980-b780-4ec9-a2d3-4684981d8d4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.641502 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qqjf\" (UniqueName: \"kubernetes.io/projected/45a6b980-b780-4ec9-a2d3-4684981d8d4e-kube-api-access-7qqjf\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.641550 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45a6b980-b780-4ec9-a2d3-4684981d8d4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.814062 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fg84s"] Feb 02 10:47:20 crc kubenswrapper[4845]: I0202 10:47:20.820970 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fg84s"] Feb 02 10:47:21 crc kubenswrapper[4845]: I0202 10:47:21.176499 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:21 crc kubenswrapper[4845]: I0202 10:47:21.176556 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:21 crc kubenswrapper[4845]: I0202 10:47:21.216519 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:21 crc kubenswrapper[4845]: I0202 10:47:21.518650 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:21 crc kubenswrapper[4845]: I0202 10:47:21.720701 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" path="/var/lib/kubelet/pods/45a6b980-b780-4ec9-a2d3-4684981d8d4e/volumes" Feb 02 10:47:24 crc kubenswrapper[4845]: I0202 10:47:24.199798 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5lf6"] Feb 02 10:47:24 crc kubenswrapper[4845]: I0202 10:47:24.200293 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b5lf6" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerName="registry-server" containerID="cri-o://5fb31abd9b4de4f2255ab9a57011ca0b111c55d34d92ac3f568d3966ce45afe3" gracePeriod=2 Feb 02 10:47:24 crc kubenswrapper[4845]: I0202 10:47:24.495813 4845 generic.go:334] "Generic (PLEG): container finished" podID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerID="5fb31abd9b4de4f2255ab9a57011ca0b111c55d34d92ac3f568d3966ce45afe3" exitCode=0 Feb 02 10:47:24 crc kubenswrapper[4845]: I0202 10:47:24.495854 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5lf6" event={"ID":"63bd6954-f4d1-44ff-9b92-074c115afffc","Type":"ContainerDied","Data":"5fb31abd9b4de4f2255ab9a57011ca0b111c55d34d92ac3f568d3966ce45afe3"} Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.341879 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.414124 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-catalog-content\") pod \"63bd6954-f4d1-44ff-9b92-074c115afffc\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.414563 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqs2v\" (UniqueName: \"kubernetes.io/projected/63bd6954-f4d1-44ff-9b92-074c115afffc-kube-api-access-fqs2v\") pod \"63bd6954-f4d1-44ff-9b92-074c115afffc\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.414725 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-utilities\") pod \"63bd6954-f4d1-44ff-9b92-074c115afffc\" (UID: \"63bd6954-f4d1-44ff-9b92-074c115afffc\") " Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.415931 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-utilities" (OuterVolumeSpecName: "utilities") pod "63bd6954-f4d1-44ff-9b92-074c115afffc" (UID: "63bd6954-f4d1-44ff-9b92-074c115afffc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.425446 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bd6954-f4d1-44ff-9b92-074c115afffc-kube-api-access-fqs2v" (OuterVolumeSpecName: "kube-api-access-fqs2v") pod "63bd6954-f4d1-44ff-9b92-074c115afffc" (UID: "63bd6954-f4d1-44ff-9b92-074c115afffc"). InnerVolumeSpecName "kube-api-access-fqs2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.449547 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63bd6954-f4d1-44ff-9b92-074c115afffc" (UID: "63bd6954-f4d1-44ff-9b92-074c115afffc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.512082 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b5lf6" event={"ID":"63bd6954-f4d1-44ff-9b92-074c115afffc","Type":"ContainerDied","Data":"050f1387b1bafaa0eb6a0d18bdb89af0db564a1754a0812c0ed9d00b6642e72b"} Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.512174 4845 scope.go:117] "RemoveContainer" containerID="5fb31abd9b4de4f2255ab9a57011ca0b111c55d34d92ac3f568d3966ce45afe3" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.512581 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b5lf6" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.516746 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.516777 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63bd6954-f4d1-44ff-9b92-074c115afffc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.516789 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqs2v\" (UniqueName: \"kubernetes.io/projected/63bd6954-f4d1-44ff-9b92-074c115afffc-kube-api-access-fqs2v\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.529999 4845 scope.go:117] "RemoveContainer" containerID="25c9408a498bdfbac908ba7a192f1dcdf10cbbe5bf9963596e569aea99ba57cb" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.550909 4845 scope.go:117] "RemoveContainer" containerID="3fd971db0c565ac1d8cf02bc0f6981f66795692b2f573fc704b7bb4ba2cf3eef" Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.562561 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5lf6"] Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.570773 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b5lf6"] Feb 02 10:47:25 crc kubenswrapper[4845]: I0202 10:47:25.723170 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" path="/var/lib/kubelet/pods/63bd6954-f4d1-44ff-9b92-074c115afffc/volumes" Feb 02 10:47:33 crc kubenswrapper[4845]: I0202 10:47:33.312404 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-webhook-server-66c6bb874c-q55bn" Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.238170 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.238482 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.238526 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.239189 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c8a61ef5e1d6c97c545382d55b8a80c690bc952b158b0bc2a66b1f6b33d1ffd"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.239235 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://5c8a61ef5e1d6c97c545382d55b8a80c690bc952b158b0bc2a66b1f6b33d1ffd" gracePeriod=600 Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.667419 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="5c8a61ef5e1d6c97c545382d55b8a80c690bc952b158b0bc2a66b1f6b33d1ffd" exitCode=0 Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.667464 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"5c8a61ef5e1d6c97c545382d55b8a80c690bc952b158b0bc2a66b1f6b33d1ffd"} Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.667838 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"b265cb3810e3935261baf8bbd2287ce4faf34ceae4eb09c4d8144e547b3debd5"} Feb 02 10:47:46 crc kubenswrapper[4845]: I0202 10:47:46.667868 4845 scope.go:117] "RemoveContainer" containerID="faf8e85b5f2efdb91a1dcdfb7d3d9ff033956bb15922ba78cb0d90c0661d34f8" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.011822 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-bcff8566-gkqml" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.695670 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b"] Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696317 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696341 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696363 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerName="extract-utilities" Feb 02 10:47:53 crc 
kubenswrapper[4845]: I0202 10:47:53.696373 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerName="extract-utilities" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696417 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerName="extract-content" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696426 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerName="extract-content" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696441 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696449 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696465 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerName="extract-content" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696473 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerName="extract-content" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696489 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerName="extract-utilities" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696499 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerName="extract-utilities" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696521 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerName="extract-utilities" Feb 02 10:47:53 crc 
kubenswrapper[4845]: I0202 10:47:53.696530 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerName="extract-utilities" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696551 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696559 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.696585 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerName="extract-content" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696594 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerName="extract-content" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696779 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="61fbccca-6f3d-48c6-b052-63b8f73bb8fd" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696799 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a6b980-b780-4ec9-a2d3-4684981d8d4e" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.696811 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="63bd6954-f4d1-44ff-9b92-074c115afffc" containerName="registry-server" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.697725 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.700006 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.700142 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-dwpkm" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.707530 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bnlrj"] Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.717374 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.725091 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.728290 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b"] Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.728979 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.783614 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7gchc"] Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.786528 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.789496 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.789787 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5xjw2" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.789613 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.791429 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.792333 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lmzm\" (UniqueName: \"kubernetes.io/projected/8e70dfea-db96-43f0-82ea-e9342326f82f-kube-api-access-8lmzm\") pod \"frr-k8s-webhook-server-7df86c4f6c-hd78b\" (UID: \"8e70dfea-db96-43f0-82ea-e9342326f82f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.792466 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e70dfea-db96-43f0-82ea-e9342326f82f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-hd78b\" (UID: \"8e70dfea-db96-43f0-82ea-e9342326f82f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.799841 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-pwcrt"] Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.801120 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.802904 4845 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.813580 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-pwcrt"] Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.893717 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-conf\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.893986 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-sockets\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894087 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/572a79b7-a042-4090-afdd-924cdb0f9d3e-metrics-certs\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894171 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-metrics\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894248 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-reloader\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894340 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwnj\" (UniqueName: \"kubernetes.io/projected/572a79b7-a042-4090-afdd-924cdb0f9d3e-kube-api-access-njwnj\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894414 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ae8d6393-e53b-4acc-9a90-094d95e29c03-metallb-excludel2\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894587 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e70dfea-db96-43f0-82ea-e9342326f82f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-hd78b\" (UID: \"8e70dfea-db96-43f0-82ea-e9342326f82f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894732 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chlb\" (UniqueName: \"kubernetes.io/projected/64760ce4-85d6-4e58-aa77-99c1ca4d936e-kube-api-access-6chlb\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894853 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64760ce4-85d6-4e58-aa77-99c1ca4d936e-metrics-certs\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894905 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64760ce4-85d6-4e58-aa77-99c1ca4d936e-cert\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894930 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbvw5\" (UniqueName: \"kubernetes.io/projected/ae8d6393-e53b-4acc-9a90-094d95e29c03-kube-api-access-qbvw5\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.894957 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lmzm\" (UniqueName: \"kubernetes.io/projected/8e70dfea-db96-43f0-82ea-e9342326f82f-kube-api-access-8lmzm\") pod \"frr-k8s-webhook-server-7df86c4f6c-hd78b\" (UID: \"8e70dfea-db96-43f0-82ea-e9342326f82f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.894865 4845 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.895106 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-startup\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.895310 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e70dfea-db96-43f0-82ea-e9342326f82f-cert podName:8e70dfea-db96-43f0-82ea-e9342326f82f nodeName:}" failed. No retries permitted until 2026-02-02 10:47:54.395176566 +0000 UTC m=+955.486578016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e70dfea-db96-43f0-82ea-e9342326f82f-cert") pod "frr-k8s-webhook-server-7df86c4f6c-hd78b" (UID: "8e70dfea-db96-43f0-82ea-e9342326f82f") : secret "frr-k8s-webhook-server-cert" not found Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.895455 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-metrics-certs\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.895555 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.920622 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lmzm\" (UniqueName: \"kubernetes.io/projected/8e70dfea-db96-43f0-82ea-e9342326f82f-kube-api-access-8lmzm\") pod \"frr-k8s-webhook-server-7df86c4f6c-hd78b\" (UID: \"8e70dfea-db96-43f0-82ea-e9342326f82f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 
02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997081 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64760ce4-85d6-4e58-aa77-99c1ca4d936e-cert\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997142 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbvw5\" (UniqueName: \"kubernetes.io/projected/ae8d6393-e53b-4acc-9a90-094d95e29c03-kube-api-access-qbvw5\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997188 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-startup\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997222 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-metrics-certs\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997244 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997269 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-conf\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997294 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-sockets\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997311 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/572a79b7-a042-4090-afdd-924cdb0f9d3e-metrics-certs\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997337 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-metrics\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997395 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-reloader\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997442 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwnj\" (UniqueName: \"kubernetes.io/projected/572a79b7-a042-4090-afdd-924cdb0f9d3e-kube-api-access-njwnj\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc 
kubenswrapper[4845]: I0202 10:47:53.997474 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ae8d6393-e53b-4acc-9a90-094d95e29c03-metallb-excludel2\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997548 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6chlb\" (UniqueName: \"kubernetes.io/projected/64760ce4-85d6-4e58-aa77-99c1ca4d936e-kube-api-access-6chlb\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.997594 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64760ce4-85d6-4e58-aa77-99c1ca4d936e-metrics-certs\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.998488 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-sockets\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.998502 4845 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 02 10:47:53 crc kubenswrapper[4845]: E0202 10:47:53.998576 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist podName:ae8d6393-e53b-4acc-9a90-094d95e29c03 nodeName:}" failed. 
No retries permitted until 2026-02-02 10:47:54.498556816 +0000 UTC m=+955.589958356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist") pod "speaker-7gchc" (UID: "ae8d6393-e53b-4acc-9a90-094d95e29c03") : secret "metallb-memberlist" not found Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.998752 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-reloader\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.998792 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-metrics\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.998877 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-conf\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.999142 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ae8d6393-e53b-4acc-9a90-094d95e29c03-metallb-excludel2\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:53 crc kubenswrapper[4845]: I0202 10:47:53.999719 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/572a79b7-a042-4090-afdd-924cdb0f9d3e-frr-startup\") pod 
\"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.001453 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64760ce4-85d6-4e58-aa77-99c1ca4d936e-metrics-certs\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.002561 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/572a79b7-a042-4090-afdd-924cdb0f9d3e-metrics-certs\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.002614 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-metrics-certs\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.002564 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64760ce4-85d6-4e58-aa77-99c1ca4d936e-cert\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.025770 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chlb\" (UniqueName: \"kubernetes.io/projected/64760ce4-85d6-4e58-aa77-99c1ca4d936e-kube-api-access-6chlb\") pod \"controller-6968d8fdc4-pwcrt\" (UID: \"64760ce4-85d6-4e58-aa77-99c1ca4d936e\") " pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 
10:47:54.036591 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbvw5\" (UniqueName: \"kubernetes.io/projected/ae8d6393-e53b-4acc-9a90-094d95e29c03-kube-api-access-qbvw5\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.048554 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njwnj\" (UniqueName: \"kubernetes.io/projected/572a79b7-a042-4090-afdd-924cdb0f9d3e-kube-api-access-njwnj\") pod \"frr-k8s-bnlrj\" (UID: \"572a79b7-a042-4090-afdd-924cdb0f9d3e\") " pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.115274 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.332414 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.403083 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e70dfea-db96-43f0-82ea-e9342326f82f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-hd78b\" (UID: \"8e70dfea-db96-43f0-82ea-e9342326f82f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.407742 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e70dfea-db96-43f0-82ea-e9342326f82f-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-hd78b\" (UID: \"8e70dfea-db96-43f0-82ea-e9342326f82f\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.504879 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" 
(UniqueName: \"kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:54 crc kubenswrapper[4845]: E0202 10:47:54.505073 4845 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 02 10:47:54 crc kubenswrapper[4845]: E0202 10:47:54.505162 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist podName:ae8d6393-e53b-4acc-9a90-094d95e29c03 nodeName:}" failed. No retries permitted until 2026-02-02 10:47:55.505143159 +0000 UTC m=+956.596544609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist") pod "speaker-7gchc" (UID: "ae8d6393-e53b-4acc-9a90-094d95e29c03") : secret "metallb-memberlist" not found Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.557185 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-pwcrt"] Feb 02 10:47:54 crc kubenswrapper[4845]: W0202 10:47:54.560641 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64760ce4_85d6_4e58_aa77_99c1ca4d936e.slice/crio-0590a2a7a5a913d456dd8b46e87c22c6723322d20ca62211d00dde8f18125e73 WatchSource:0}: Error finding container 0590a2a7a5a913d456dd8b46e87c22c6723322d20ca62211d00dde8f18125e73: Status 404 returned error can't find the container with id 0590a2a7a5a913d456dd8b46e87c22c6723322d20ca62211d00dde8f18125e73 Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.615159 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.726852 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-pwcrt" event={"ID":"64760ce4-85d6-4e58-aa77-99c1ca4d936e","Type":"ContainerStarted","Data":"0f37d84c9ab71c9aff025dd9e38c6fc067a34c9092f676a7ae642d5595c9ec36"} Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.726917 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-pwcrt" event={"ID":"64760ce4-85d6-4e58-aa77-99c1ca4d936e","Type":"ContainerStarted","Data":"0590a2a7a5a913d456dd8b46e87c22c6723322d20ca62211d00dde8f18125e73"} Feb 02 10:47:54 crc kubenswrapper[4845]: I0202 10:47:54.728251 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerStarted","Data":"1282d09a6fb51b25a80b5ae1aee4df725d28958f80cf933c202cf826d2971f7c"} Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.034690 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b"] Feb 02 10:47:55 crc kubenswrapper[4845]: W0202 10:47:55.039710 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e70dfea_db96_43f0_82ea_e9342326f82f.slice/crio-16e8987d7ff224e50854a360954e0f743645d7dddb9990ddd21357e90873934b WatchSource:0}: Error finding container 16e8987d7ff224e50854a360954e0f743645d7dddb9990ddd21357e90873934b: Status 404 returned error can't find the container with id 16e8987d7ff224e50854a360954e0f743645d7dddb9990ddd21357e90873934b Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.534875 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist\") pod 
\"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.543652 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ae8d6393-e53b-4acc-9a90-094d95e29c03-memberlist\") pod \"speaker-7gchc\" (UID: \"ae8d6393-e53b-4acc-9a90-094d95e29c03\") " pod="metallb-system/speaker-7gchc" Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.601427 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7gchc" Feb 02 10:47:55 crc kubenswrapper[4845]: W0202 10:47:55.637942 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8d6393_e53b_4acc_9a90_094d95e29c03.slice/crio-c16768ffae7f824fee36ebde27b3a7d22804da52029443e75293fc9874eb1cfd WatchSource:0}: Error finding container c16768ffae7f824fee36ebde27b3a7d22804da52029443e75293fc9874eb1cfd: Status 404 returned error can't find the container with id c16768ffae7f824fee36ebde27b3a7d22804da52029443e75293fc9874eb1cfd Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.742780 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" event={"ID":"8e70dfea-db96-43f0-82ea-e9342326f82f","Type":"ContainerStarted","Data":"16e8987d7ff224e50854a360954e0f743645d7dddb9990ddd21357e90873934b"} Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.745069 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7gchc" event={"ID":"ae8d6393-e53b-4acc-9a90-094d95e29c03","Type":"ContainerStarted","Data":"c16768ffae7f824fee36ebde27b3a7d22804da52029443e75293fc9874eb1cfd"} Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.747635 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-pwcrt" 
event={"ID":"64760ce4-85d6-4e58-aa77-99c1ca4d936e","Type":"ContainerStarted","Data":"55f8094bf0584bd760bf6934118031442f40abcc685016c69d4a59df82a7cb61"} Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.747826 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:47:55 crc kubenswrapper[4845]: I0202 10:47:55.774374 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-pwcrt" podStartSLOduration=2.774357635 podStartE2EDuration="2.774357635s" podCreationTimestamp="2026-02-02 10:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:47:55.770239456 +0000 UTC m=+956.861640906" watchObservedRunningTime="2026-02-02 10:47:55.774357635 +0000 UTC m=+956.865759085" Feb 02 10:47:56 crc kubenswrapper[4845]: I0202 10:47:56.759383 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7gchc" event={"ID":"ae8d6393-e53b-4acc-9a90-094d95e29c03","Type":"ContainerStarted","Data":"fd5c93be43b68cf9b3886bf7c14e709a87c22b7bdf2d0e54202b05948a4b3757"} Feb 02 10:47:56 crc kubenswrapper[4845]: I0202 10:47:56.759885 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7gchc" event={"ID":"ae8d6393-e53b-4acc-9a90-094d95e29c03","Type":"ContainerStarted","Data":"af65a7560f2d423923aa7523416c3f4785fb0d3601d5b9b1f49139adaad97f48"} Feb 02 10:47:56 crc kubenswrapper[4845]: I0202 10:47:56.778926 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7gchc" podStartSLOduration=3.778910942 podStartE2EDuration="3.778910942s" podCreationTimestamp="2026-02-02 10:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:47:56.775989458 +0000 UTC m=+957.867390908" 
watchObservedRunningTime="2026-02-02 10:47:56.778910942 +0000 UTC m=+957.870312392" Feb 02 10:47:57 crc kubenswrapper[4845]: I0202 10:47:57.771371 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7gchc" Feb 02 10:48:02 crc kubenswrapper[4845]: I0202 10:48:02.819321 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" event={"ID":"8e70dfea-db96-43f0-82ea-e9342326f82f","Type":"ContainerStarted","Data":"d4db9ffee14ab20682a8c10d4d24a9441bb64177eb56c2beaabbfad5d27f88ea"} Feb 02 10:48:02 crc kubenswrapper[4845]: I0202 10:48:02.821067 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:48:02 crc kubenswrapper[4845]: I0202 10:48:02.822833 4845 generic.go:334] "Generic (PLEG): container finished" podID="572a79b7-a042-4090-afdd-924cdb0f9d3e" containerID="eba3917b86c2ceaa0a457e3fac52b2ee3445d6d144761ce700a4bffb58c1c47b" exitCode=0 Feb 02 10:48:02 crc kubenswrapper[4845]: I0202 10:48:02.822898 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerDied","Data":"eba3917b86c2ceaa0a457e3fac52b2ee3445d6d144761ce700a4bffb58c1c47b"} Feb 02 10:48:02 crc kubenswrapper[4845]: I0202 10:48:02.865489 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" podStartSLOduration=2.872273567 podStartE2EDuration="9.865469082s" podCreationTimestamp="2026-02-02 10:47:53 +0000 UTC" firstStartedPulling="2026-02-02 10:47:55.042073306 +0000 UTC m=+956.133474746" lastFinishedPulling="2026-02-02 10:48:02.035268811 +0000 UTC m=+963.126670261" observedRunningTime="2026-02-02 10:48:02.859925752 +0000 UTC m=+963.951327222" watchObservedRunningTime="2026-02-02 10:48:02.865469082 +0000 UTC m=+963.956870532" Feb 02 10:48:03 crc 
kubenswrapper[4845]: I0202 10:48:03.835224 4845 generic.go:334] "Generic (PLEG): container finished" podID="572a79b7-a042-4090-afdd-924cdb0f9d3e" containerID="7cba66176bdc444500a8c03d75d83560a5ee92511b8bb153ed816bfd3931c548" exitCode=0 Feb 02 10:48:03 crc kubenswrapper[4845]: I0202 10:48:03.835339 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerDied","Data":"7cba66176bdc444500a8c03d75d83560a5ee92511b8bb153ed816bfd3931c548"} Feb 02 10:48:04 crc kubenswrapper[4845]: I0202 10:48:04.120174 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-pwcrt" Feb 02 10:48:04 crc kubenswrapper[4845]: I0202 10:48:04.850624 4845 generic.go:334] "Generic (PLEG): container finished" podID="572a79b7-a042-4090-afdd-924cdb0f9d3e" containerID="92701bbae76486ceada5b98e913d02e96c3dd644aedb1417e0da59fd7013429f" exitCode=0 Feb 02 10:48:04 crc kubenswrapper[4845]: I0202 10:48:04.850679 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerDied","Data":"92701bbae76486ceada5b98e913d02e96c3dd644aedb1417e0da59fd7013429f"} Feb 02 10:48:05 crc kubenswrapper[4845]: I0202 10:48:05.606130 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7gchc" Feb 02 10:48:05 crc kubenswrapper[4845]: I0202 10:48:05.862263 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerStarted","Data":"37edb9b479622c8c0140e2c2cdac39cb28ecbc733191c5d98f0bd229229d0a91"} Feb 02 10:48:05 crc kubenswrapper[4845]: I0202 10:48:05.862327 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" 
event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerStarted","Data":"78bede3a13821b7fbd0fb83b38e617d4066b5a02f0265386a212402f1f41b073"} Feb 02 10:48:05 crc kubenswrapper[4845]: I0202 10:48:05.862339 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerStarted","Data":"117266be18c79e94fe17ed2d20b5bba474aafcdfe50bbe1168b07d2ef44090dd"} Feb 02 10:48:05 crc kubenswrapper[4845]: I0202 10:48:05.862350 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerStarted","Data":"30b2f98c8a597d889d21630394d8e21560ba0f8b39df5caea42a1289eff00e1c"} Feb 02 10:48:06 crc kubenswrapper[4845]: I0202 10:48:06.873969 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerStarted","Data":"cded894d172d26689377a2c8eea087c1d79f0fb300715702d9c0300e0ecaf3de"} Feb 02 10:48:06 crc kubenswrapper[4845]: I0202 10:48:06.874223 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:48:06 crc kubenswrapper[4845]: I0202 10:48:06.874235 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bnlrj" event={"ID":"572a79b7-a042-4090-afdd-924cdb0f9d3e","Type":"ContainerStarted","Data":"a6f9d2b4c8188edd880f27710dd6f0a89aa41b60cc3c5c46304a6f4a03b7d86a"} Feb 02 10:48:06 crc kubenswrapper[4845]: I0202 10:48:06.899838 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bnlrj" podStartSLOduration=6.394474408 podStartE2EDuration="13.899815526s" podCreationTimestamp="2026-02-02 10:47:53 +0000 UTC" firstStartedPulling="2026-02-02 10:47:54.49027555 +0000 UTC m=+955.581677000" lastFinishedPulling="2026-02-02 10:48:01.995616668 +0000 UTC m=+963.087018118" 
observedRunningTime="2026-02-02 10:48:06.893654279 +0000 UTC m=+967.985055749" watchObservedRunningTime="2026-02-02 10:48:06.899815526 +0000 UTC m=+967.991216976" Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.213734 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5mv28"] Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.215041 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5mv28" Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.224236 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-jk5k7" Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.224344 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.225033 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.226130 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5mv28"] Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.287731 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gcn9\" (UniqueName: \"kubernetes.io/projected/25e84e3e-ad0d-4488-9be5-eba5934ff498-kube-api-access-4gcn9\") pod \"openstack-operator-index-5mv28\" (UID: \"25e84e3e-ad0d-4488-9be5-eba5934ff498\") " pod="openstack-operators/openstack-operator-index-5mv28" Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.389474 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gcn9\" (UniqueName: \"kubernetes.io/projected/25e84e3e-ad0d-4488-9be5-eba5934ff498-kube-api-access-4gcn9\") pod 
\"openstack-operator-index-5mv28\" (UID: \"25e84e3e-ad0d-4488-9be5-eba5934ff498\") " pod="openstack-operators/openstack-operator-index-5mv28" Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.406540 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gcn9\" (UniqueName: \"kubernetes.io/projected/25e84e3e-ad0d-4488-9be5-eba5934ff498-kube-api-access-4gcn9\") pod \"openstack-operator-index-5mv28\" (UID: \"25e84e3e-ad0d-4488-9be5-eba5934ff498\") " pod="openstack-operators/openstack-operator-index-5mv28" Feb 02 10:48:08 crc kubenswrapper[4845]: I0202 10:48:08.560838 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5mv28" Feb 02 10:48:09 crc kubenswrapper[4845]: I0202 10:48:09.067372 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5mv28"] Feb 02 10:48:09 crc kubenswrapper[4845]: W0202 10:48:09.077241 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e84e3e_ad0d_4488_9be5_eba5934ff498.slice/crio-ae6e4d74845b9b32c84b30cb4f0d76b0757673736a4a0c46e58e4f13036c9758 WatchSource:0}: Error finding container ae6e4d74845b9b32c84b30cb4f0d76b0757673736a4a0c46e58e4f13036c9758: Status 404 returned error can't find the container with id ae6e4d74845b9b32c84b30cb4f0d76b0757673736a4a0c46e58e4f13036c9758 Feb 02 10:48:09 crc kubenswrapper[4845]: I0202 10:48:09.333945 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:48:09 crc kubenswrapper[4845]: I0202 10:48:09.370001 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:48:09 crc kubenswrapper[4845]: I0202 10:48:09.899010 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5mv28" 
event={"ID":"25e84e3e-ad0d-4488-9be5-eba5934ff498","Type":"ContainerStarted","Data":"ae6e4d74845b9b32c84b30cb4f0d76b0757673736a4a0c46e58e4f13036c9758"} Feb 02 10:48:11 crc kubenswrapper[4845]: I0202 10:48:11.584138 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5mv28"] Feb 02 10:48:11 crc kubenswrapper[4845]: I0202 10:48:11.924672 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5mv28" event={"ID":"25e84e3e-ad0d-4488-9be5-eba5934ff498","Type":"ContainerStarted","Data":"027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2"} Feb 02 10:48:11 crc kubenswrapper[4845]: I0202 10:48:11.925706 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5mv28" podUID="25e84e3e-ad0d-4488-9be5-eba5934ff498" containerName="registry-server" containerID="cri-o://027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2" gracePeriod=2 Feb 02 10:48:11 crc kubenswrapper[4845]: I0202 10:48:11.945052 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5mv28" podStartSLOduration=1.303143055 podStartE2EDuration="3.94503806s" podCreationTimestamp="2026-02-02 10:48:08 +0000 UTC" firstStartedPulling="2026-02-02 10:48:09.079119197 +0000 UTC m=+970.170520647" lastFinishedPulling="2026-02-02 10:48:11.721014202 +0000 UTC m=+972.812415652" observedRunningTime="2026-02-02 10:48:11.944084462 +0000 UTC m=+973.035485922" watchObservedRunningTime="2026-02-02 10:48:11.94503806 +0000 UTC m=+973.036439510" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.200293 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-csz6h"] Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.201753 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.210865 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-csz6h"] Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.250308 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2h6z\" (UniqueName: \"kubernetes.io/projected/153335e1-79de-4c5c-a3cd-2731d0998994-kube-api-access-q2h6z\") pod \"openstack-operator-index-csz6h\" (UID: \"153335e1-79de-4c5c-a3cd-2731d0998994\") " pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.351946 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5mv28_25e84e3e-ad0d-4488-9be5-eba5934ff498/registry-server/0.log" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.352028 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5mv28" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.352056 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2h6z\" (UniqueName: \"kubernetes.io/projected/153335e1-79de-4c5c-a3cd-2731d0998994-kube-api-access-q2h6z\") pod \"openstack-operator-index-csz6h\" (UID: \"153335e1-79de-4c5c-a3cd-2731d0998994\") " pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.379032 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2h6z\" (UniqueName: \"kubernetes.io/projected/153335e1-79de-4c5c-a3cd-2731d0998994-kube-api-access-q2h6z\") pod \"openstack-operator-index-csz6h\" (UID: \"153335e1-79de-4c5c-a3cd-2731d0998994\") " pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.524125 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.555565 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gcn9\" (UniqueName: \"kubernetes.io/projected/25e84e3e-ad0d-4488-9be5-eba5934ff498-kube-api-access-4gcn9\") pod \"25e84e3e-ad0d-4488-9be5-eba5934ff498\" (UID: \"25e84e3e-ad0d-4488-9be5-eba5934ff498\") " Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.561629 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e84e3e-ad0d-4488-9be5-eba5934ff498-kube-api-access-4gcn9" (OuterVolumeSpecName: "kube-api-access-4gcn9") pod "25e84e3e-ad0d-4488-9be5-eba5934ff498" (UID: "25e84e3e-ad0d-4488-9be5-eba5934ff498"). InnerVolumeSpecName "kube-api-access-4gcn9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.659970 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gcn9\" (UniqueName: \"kubernetes.io/projected/25e84e3e-ad0d-4488-9be5-eba5934ff498-kube-api-access-4gcn9\") on node \"crc\" DevicePath \"\"" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.925410 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-csz6h"] Feb 02 10:48:12 crc kubenswrapper[4845]: W0202 10:48:12.929068 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod153335e1_79de_4c5c_a3cd_2731d0998994.slice/crio-20ccf78a0ecf8b2b5fb62e705291bf8c64cc0d787fae4343bf78c7297cfa0db1 WatchSource:0}: Error finding container 20ccf78a0ecf8b2b5fb62e705291bf8c64cc0d787fae4343bf78c7297cfa0db1: Status 404 returned error can't find the container with id 20ccf78a0ecf8b2b5fb62e705291bf8c64cc0d787fae4343bf78c7297cfa0db1 Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.937640 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-5mv28_25e84e3e-ad0d-4488-9be5-eba5934ff498/registry-server/0.log" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.937694 4845 generic.go:334] "Generic (PLEG): container finished" podID="25e84e3e-ad0d-4488-9be5-eba5934ff498" containerID="027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2" exitCode=2 Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.937728 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5mv28" event={"ID":"25e84e3e-ad0d-4488-9be5-eba5934ff498","Type":"ContainerDied","Data":"027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2"} Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.937745 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5mv28" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.937756 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5mv28" event={"ID":"25e84e3e-ad0d-4488-9be5-eba5934ff498","Type":"ContainerDied","Data":"ae6e4d74845b9b32c84b30cb4f0d76b0757673736a4a0c46e58e4f13036c9758"} Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.937791 4845 scope.go:117] "RemoveContainer" containerID="027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.994096 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5mv28"] Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.997218 4845 scope.go:117] "RemoveContainer" containerID="027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2" Feb 02 10:48:12 crc kubenswrapper[4845]: E0202 10:48:12.997725 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2\": container with ID starting with 027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2 not found: ID does not exist" containerID="027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2" Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.997761 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2"} err="failed to get container status \"027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2\": rpc error: code = NotFound desc = could not find container \"027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2\": container with ID starting with 027096357aef1997ef6b1c2ea65438d90bf6c7e0b0474e4fee2d5d6f58b26fe2 not found: ID does not exist" 
Feb 02 10:48:12 crc kubenswrapper[4845]: I0202 10:48:12.999277 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5mv28"] Feb 02 10:48:13 crc kubenswrapper[4845]: I0202 10:48:13.723201 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e84e3e-ad0d-4488-9be5-eba5934ff498" path="/var/lib/kubelet/pods/25e84e3e-ad0d-4488-9be5-eba5934ff498/volumes" Feb 02 10:48:13 crc kubenswrapper[4845]: I0202 10:48:13.950404 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-csz6h" event={"ID":"153335e1-79de-4c5c-a3cd-2731d0998994","Type":"ContainerStarted","Data":"e51159131a4297f98cd5e817d82e1b3c0174b4a4cf91d1ed71635b3f457fc652"} Feb 02 10:48:13 crc kubenswrapper[4845]: I0202 10:48:13.951149 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-csz6h" event={"ID":"153335e1-79de-4c5c-a3cd-2731d0998994","Type":"ContainerStarted","Data":"20ccf78a0ecf8b2b5fb62e705291bf8c64cc0d787fae4343bf78c7297cfa0db1"} Feb 02 10:48:13 crc kubenswrapper[4845]: I0202 10:48:13.978744 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-csz6h" podStartSLOduration=1.9118989960000001 podStartE2EDuration="1.978725283s" podCreationTimestamp="2026-02-02 10:48:12 +0000 UTC" firstStartedPulling="2026-02-02 10:48:12.931963868 +0000 UTC m=+974.023365328" lastFinishedPulling="2026-02-02 10:48:12.998790155 +0000 UTC m=+974.090191615" observedRunningTime="2026-02-02 10:48:13.974098839 +0000 UTC m=+975.065500299" watchObservedRunningTime="2026-02-02 10:48:13.978725283 +0000 UTC m=+975.070126743" Feb 02 10:48:14 crc kubenswrapper[4845]: I0202 10:48:14.623282 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-hd78b" Feb 02 10:48:22 crc kubenswrapper[4845]: I0202 10:48:22.524953 4845 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:22 crc kubenswrapper[4845]: I0202 10:48:22.525490 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:22 crc kubenswrapper[4845]: I0202 10:48:22.564879 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:23 crc kubenswrapper[4845]: I0202 10:48:23.065833 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-csz6h" Feb 02 10:48:24 crc kubenswrapper[4845]: I0202 10:48:24.341162 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bnlrj" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.041872 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf"] Feb 02 10:48:29 crc kubenswrapper[4845]: E0202 10:48:29.043009 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25e84e3e-ad0d-4488-9be5-eba5934ff498" containerName="registry-server" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.043029 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e84e3e-ad0d-4488-9be5-eba5934ff498" containerName="registry-server" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.043214 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="25e84e3e-ad0d-4488-9be5-eba5934ff498" containerName="registry-server" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.044730 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.048543 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-z6zzx" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.050316 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf"] Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.156726 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-bundle\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.156856 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ch42\" (UniqueName: \"kubernetes.io/projected/08c56222-38b1-47b8-b554-cc59e503ecf0-kube-api-access-8ch42\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.156982 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-util\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 
10:48:29.258939 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ch42\" (UniqueName: \"kubernetes.io/projected/08c56222-38b1-47b8-b554-cc59e503ecf0-kube-api-access-8ch42\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.259099 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-util\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.259157 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-bundle\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.259734 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-util\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.259775 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-bundle\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.293048 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ch42\" (UniqueName: \"kubernetes.io/projected/08c56222-38b1-47b8-b554-cc59e503ecf0-kube-api-access-8ch42\") pod \"a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.375158 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:29 crc kubenswrapper[4845]: I0202 10:48:29.784987 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf"] Feb 02 10:48:29 crc kubenswrapper[4845]: W0202 10:48:29.799173 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08c56222_38b1_47b8_b554_cc59e503ecf0.slice/crio-ff167cc736960a1a975e6b21bf4059352ae59016da5e278f72ff2083ca38b08c WatchSource:0}: Error finding container ff167cc736960a1a975e6b21bf4059352ae59016da5e278f72ff2083ca38b08c: Status 404 returned error can't find the container with id ff167cc736960a1a975e6b21bf4059352ae59016da5e278f72ff2083ca38b08c Feb 02 10:48:30 crc kubenswrapper[4845]: I0202 10:48:30.100195 4845 generic.go:334] "Generic (PLEG): container finished" podID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerID="a42872033bfbe1c357ca946bba1ebae75f69a4321b58131901cbbbd890732208" exitCode=0 Feb 02 
10:48:30 crc kubenswrapper[4845]: I0202 10:48:30.100239 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" event={"ID":"08c56222-38b1-47b8-b554-cc59e503ecf0","Type":"ContainerDied","Data":"a42872033bfbe1c357ca946bba1ebae75f69a4321b58131901cbbbd890732208"} Feb 02 10:48:30 crc kubenswrapper[4845]: I0202 10:48:30.100292 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" event={"ID":"08c56222-38b1-47b8-b554-cc59e503ecf0","Type":"ContainerStarted","Data":"ff167cc736960a1a975e6b21bf4059352ae59016da5e278f72ff2083ca38b08c"} Feb 02 10:48:31 crc kubenswrapper[4845]: I0202 10:48:31.111106 4845 generic.go:334] "Generic (PLEG): container finished" podID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerID="6435b21e3a2e68b60ab2550011b6d5a53c97998775b96c5e8d59875d3ef7907c" exitCode=0 Feb 02 10:48:31 crc kubenswrapper[4845]: I0202 10:48:31.111281 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" event={"ID":"08c56222-38b1-47b8-b554-cc59e503ecf0","Type":"ContainerDied","Data":"6435b21e3a2e68b60ab2550011b6d5a53c97998775b96c5e8d59875d3ef7907c"} Feb 02 10:48:32 crc kubenswrapper[4845]: I0202 10:48:32.125059 4845 generic.go:334] "Generic (PLEG): container finished" podID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerID="21f8686de02b943f94e9bed25d53b1f3cad9303746af566197cea4c7dbccf224" exitCode=0 Feb 02 10:48:32 crc kubenswrapper[4845]: I0202 10:48:32.125983 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" event={"ID":"08c56222-38b1-47b8-b554-cc59e503ecf0","Type":"ContainerDied","Data":"21f8686de02b943f94e9bed25d53b1f3cad9303746af566197cea4c7dbccf224"} Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.451248 
4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.631444 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-bundle\") pod \"08c56222-38b1-47b8-b554-cc59e503ecf0\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.631563 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ch42\" (UniqueName: \"kubernetes.io/projected/08c56222-38b1-47b8-b554-cc59e503ecf0-kube-api-access-8ch42\") pod \"08c56222-38b1-47b8-b554-cc59e503ecf0\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.631811 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-util\") pod \"08c56222-38b1-47b8-b554-cc59e503ecf0\" (UID: \"08c56222-38b1-47b8-b554-cc59e503ecf0\") " Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.632519 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-bundle" (OuterVolumeSpecName: "bundle") pod "08c56222-38b1-47b8-b554-cc59e503ecf0" (UID: "08c56222-38b1-47b8-b554-cc59e503ecf0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.639069 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c56222-38b1-47b8-b554-cc59e503ecf0-kube-api-access-8ch42" (OuterVolumeSpecName: "kube-api-access-8ch42") pod "08c56222-38b1-47b8-b554-cc59e503ecf0" (UID: "08c56222-38b1-47b8-b554-cc59e503ecf0"). 
InnerVolumeSpecName "kube-api-access-8ch42". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.645853 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-util" (OuterVolumeSpecName: "util") pod "08c56222-38b1-47b8-b554-cc59e503ecf0" (UID: "08c56222-38b1-47b8-b554-cc59e503ecf0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.733091 4845 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.733117 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ch42\" (UniqueName: \"kubernetes.io/projected/08c56222-38b1-47b8-b554-cc59e503ecf0-kube-api-access-8ch42\") on node \"crc\" DevicePath \"\"" Feb 02 10:48:33 crc kubenswrapper[4845]: I0202 10:48:33.733126 4845 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08c56222-38b1-47b8-b554-cc59e503ecf0-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:48:34 crc kubenswrapper[4845]: I0202 10:48:34.145348 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" event={"ID":"08c56222-38b1-47b8-b554-cc59e503ecf0","Type":"ContainerDied","Data":"ff167cc736960a1a975e6b21bf4059352ae59016da5e278f72ff2083ca38b08c"} Feb 02 10:48:34 crc kubenswrapper[4845]: I0202 10:48:34.145414 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff167cc736960a1a975e6b21bf4059352ae59016da5e278f72ff2083ca38b08c" Feb 02 10:48:34 crc kubenswrapper[4845]: I0202 10:48:34.146012 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf" Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.730409 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"] Feb 02 10:48:41 crc kubenswrapper[4845]: E0202 10:48:41.731280 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerName="pull" Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.731294 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerName="pull" Feb 02 10:48:41 crc kubenswrapper[4845]: E0202 10:48:41.731308 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerName="util" Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.731315 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerName="util" Feb 02 10:48:41 crc kubenswrapper[4845]: E0202 10:48:41.731324 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerName="extract" Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.731330 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerName="extract" Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.731506 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c56222-38b1-47b8-b554-cc59e503ecf0" containerName="extract" Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.732271 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"
Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.735948 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-96d9k"
Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.759699 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"]
Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.882136 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7bsh\" (UniqueName: \"kubernetes.io/projected/e693a9f1-6990-407e-9d01-a23428a6f602-kube-api-access-d7bsh\") pod \"openstack-operator-controller-init-5649bd689f-k5lt8\" (UID: \"e693a9f1-6990-407e-9d01-a23428a6f602\") " pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"
Feb 02 10:48:41 crc kubenswrapper[4845]: I0202 10:48:41.984191 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7bsh\" (UniqueName: \"kubernetes.io/projected/e693a9f1-6990-407e-9d01-a23428a6f602-kube-api-access-d7bsh\") pod \"openstack-operator-controller-init-5649bd689f-k5lt8\" (UID: \"e693a9f1-6990-407e-9d01-a23428a6f602\") " pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"
Feb 02 10:48:42 crc kubenswrapper[4845]: I0202 10:48:42.004187 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7bsh\" (UniqueName: \"kubernetes.io/projected/e693a9f1-6990-407e-9d01-a23428a6f602-kube-api-access-d7bsh\") pod \"openstack-operator-controller-init-5649bd689f-k5lt8\" (UID: \"e693a9f1-6990-407e-9d01-a23428a6f602\") " pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"
Feb 02 10:48:42 crc kubenswrapper[4845]: I0202 10:48:42.054517 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"
Feb 02 10:48:42 crc kubenswrapper[4845]: I0202 10:48:42.529301 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"]
Feb 02 10:48:43 crc kubenswrapper[4845]: I0202 10:48:43.276684 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8" event={"ID":"e693a9f1-6990-407e-9d01-a23428a6f602","Type":"ContainerStarted","Data":"fdc0e5d437c7da9a195355cf6fb416e18238f17412a4d536ae6c930273c7166d"}
Feb 02 10:48:46 crc kubenswrapper[4845]: I0202 10:48:46.300981 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8" event={"ID":"e693a9f1-6990-407e-9d01-a23428a6f602","Type":"ContainerStarted","Data":"eac0f8cadc1ccb0712626005360c11ed246f61bf281e6a1c7de2d446f138bb99"}
Feb 02 10:48:46 crc kubenswrapper[4845]: I0202 10:48:46.301444 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"
Feb 02 10:48:46 crc kubenswrapper[4845]: I0202 10:48:46.337053 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8" podStartSLOduration=1.814675334 podStartE2EDuration="5.337035669s" podCreationTimestamp="2026-02-02 10:48:41 +0000 UTC" firstStartedPulling="2026-02-02 10:48:42.551458896 +0000 UTC m=+1003.642860346" lastFinishedPulling="2026-02-02 10:48:46.073819231 +0000 UTC m=+1007.165220681" observedRunningTime="2026-02-02 10:48:46.330638624 +0000 UTC m=+1007.422040074" watchObservedRunningTime="2026-02-02 10:48:46.337035669 +0000 UTC m=+1007.428437119"
Feb 02 10:48:52 crc kubenswrapper[4845]: I0202 10:48:52.058101 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5649bd689f-k5lt8"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.722660 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.725384 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.725482 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.726466 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.728283 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.729448 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.731519 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wlznx"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.732250 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-49jxh"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.732601 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-tg4bb"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.740506 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.748729 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.758834 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.760471 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.765301 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2tgn8"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.804293 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.811651 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.840878 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvvvn\" (UniqueName: \"kubernetes.io/projected/efa2be30-a7d0-4b26-865a-58448de203a0-kube-api-access-wvvvn\") pod \"designate-operator-controller-manager-6d9697b7f4-qfprx\" (UID: \"efa2be30-a7d0-4b26-865a-58448de203a0\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.841485 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmlw2\" (UniqueName: \"kubernetes.io/projected/85439e8a-f7d3-4e0b-827c-bf27e8cd53dd-kube-api-access-gmlw2\") pod \"glance-operator-controller-manager-8886f4c47-9c2wv\" (UID: \"85439e8a-f7d3-4e0b-827c-bf27e8cd53dd\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.841815 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76zt4\" (UniqueName: \"kubernetes.io/projected/d9196fe1-4a04-44c1-9a5f-1ad5de52da7f-kube-api-access-76zt4\") pod \"cinder-operator-controller-manager-8d874c8fc-c4jdf\" (UID: \"d9196fe1-4a04-44c1-9a5f-1ad5de52da7f\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.841917 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvh4s\" (UniqueName: \"kubernetes.io/projected/202de28c-c44a-43d9-98fd-4b34b1dcc65f-kube-api-access-mvh4s\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-cfvq7\" (UID: \"202de28c-c44a-43d9-98fd-4b34b1dcc65f\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.847727 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.848760 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.853229 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-qr5xp"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.859916 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.861164 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.864166 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vtcvs"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.865833 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.917270 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.941934 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.943766 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.944585 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qvb\" (UniqueName: \"kubernetes.io/projected/745626d8-548b-43bb-aee8-eeab34a86427-kube-api-access-t8qvb\") pod \"heat-operator-controller-manager-69d6db494d-pdfcx\" (UID: \"745626d8-548b-43bb-aee8-eeab34a86427\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.944647 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvvvn\" (UniqueName: \"kubernetes.io/projected/efa2be30-a7d0-4b26-865a-58448de203a0-kube-api-access-wvvvn\") pod \"designate-operator-controller-manager-6d9697b7f4-qfprx\" (UID: \"efa2be30-a7d0-4b26-865a-58448de203a0\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.944673 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x7k9\" (UniqueName: \"kubernetes.io/projected/1b72ed0e-9df5-459f-8ca9-de19874a3018-kube-api-access-6x7k9\") pod \"horizon-operator-controller-manager-5fb775575f-msjcj\" (UID: \"1b72ed0e-9df5-459f-8ca9-de19874a3018\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.944698 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmlw2\" (UniqueName: \"kubernetes.io/projected/85439e8a-f7d3-4e0b-827c-bf27e8cd53dd-kube-api-access-gmlw2\") pod \"glance-operator-controller-manager-8886f4c47-9c2wv\" (UID: \"85439e8a-f7d3-4e0b-827c-bf27e8cd53dd\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.944725 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76zt4\" (UniqueName: \"kubernetes.io/projected/d9196fe1-4a04-44c1-9a5f-1ad5de52da7f-kube-api-access-76zt4\") pod \"cinder-operator-controller-manager-8d874c8fc-c4jdf\" (UID: \"d9196fe1-4a04-44c1-9a5f-1ad5de52da7f\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.944745 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvh4s\" (UniqueName: \"kubernetes.io/projected/202de28c-c44a-43d9-98fd-4b34b1dcc65f-kube-api-access-mvh4s\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-cfvq7\" (UID: \"202de28c-c44a-43d9-98fd-4b34b1dcc65f\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.954954 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.956506 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.956752 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.956931 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dzl6d"
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.979364 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"]
Feb 02 10:49:29 crc kubenswrapper[4845]: I0202 10:49:29.979617 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mvt2n"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.029045 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvvvn\" (UniqueName: \"kubernetes.io/projected/efa2be30-a7d0-4b26-865a-58448de203a0-kube-api-access-wvvvn\") pod \"designate-operator-controller-manager-6d9697b7f4-qfprx\" (UID: \"efa2be30-a7d0-4b26-865a-58448de203a0\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.029666 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76zt4\" (UniqueName: \"kubernetes.io/projected/d9196fe1-4a04-44c1-9a5f-1ad5de52da7f-kube-api-access-76zt4\") pod \"cinder-operator-controller-manager-8d874c8fc-c4jdf\" (UID: \"d9196fe1-4a04-44c1-9a5f-1ad5de52da7f\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.030259 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvh4s\" (UniqueName: \"kubernetes.io/projected/202de28c-c44a-43d9-98fd-4b34b1dcc65f-kube-api-access-mvh4s\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-cfvq7\" (UID: \"202de28c-c44a-43d9-98fd-4b34b1dcc65f\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.031648 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmlw2\" (UniqueName: \"kubernetes.io/projected/85439e8a-f7d3-4e0b-827c-bf27e8cd53dd-kube-api-access-gmlw2\") pod \"glance-operator-controller-manager-8886f4c47-9c2wv\" (UID: \"85439e8a-f7d3-4e0b-827c-bf27e8cd53dd\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.072099 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.072836 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x7k9\" (UniqueName: \"kubernetes.io/projected/1b72ed0e-9df5-459f-8ca9-de19874a3018-kube-api-access-6x7k9\") pod \"horizon-operator-controller-manager-5fb775575f-msjcj\" (UID: \"1b72ed0e-9df5-459f-8ca9-de19874a3018\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.073097 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8qvb\" (UniqueName: \"kubernetes.io/projected/745626d8-548b-43bb-aee8-eeab34a86427-kube-api-access-t8qvb\") pod \"heat-operator-controller-manager-69d6db494d-pdfcx\" (UID: \"745626d8-548b-43bb-aee8-eeab34a86427\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.073195 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.073394 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.122263 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.125935 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.129541 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.130190 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.131159 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-njkgq"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.131448 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qvb\" (UniqueName: \"kubernetes.io/projected/745626d8-548b-43bb-aee8-eeab34a86427-kube-api-access-t8qvb\") pod \"heat-operator-controller-manager-69d6db494d-pdfcx\" (UID: \"745626d8-548b-43bb-aee8-eeab34a86427\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.139260 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x7k9\" (UniqueName: \"kubernetes.io/projected/1b72ed0e-9df5-459f-8ca9-de19874a3018-kube-api-access-6x7k9\") pod \"horizon-operator-controller-manager-5fb775575f-msjcj\" (UID: \"1b72ed0e-9df5-459f-8ca9-de19874a3018\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.171746 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.172912 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.176747 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nqqb6"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.176866 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r8tn\" (UniqueName: \"kubernetes.io/projected/36101b5e-a4ec-42b8-bb19-1cd2df2897c6-kube-api-access-2r8tn\") pod \"ironic-operator-controller-manager-5f4b8bd54d-bc2xw\" (UID: \"36101b5e-a4ec-42b8-bb19-1cd2df2897c6\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.177024 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwqw\" (UniqueName: \"kubernetes.io/projected/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-kube-api-access-qlwqw\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.177046 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.188371 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.202992 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.208471 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.216284 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.228485 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.229848 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.242876 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.243977 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.259015 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.259080 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.266278 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.268014 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.273761 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-f84th"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.274028 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-dpfq6"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.274188 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.274271 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-f86kq"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.278692 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwqw\" (UniqueName: \"kubernetes.io/projected/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-kube-api-access-qlwqw\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.278732 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f46rz\" (UniqueName: \"kubernetes.io/projected/7cc6d028-e9d2-459c-b34c-d069917832a4-kube-api-access-f46rz\") pod \"keystone-operator-controller-manager-84f48565d4-w55l4\" (UID: \"7cc6d028-e9d2-459c-b34c-d069917832a4\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.278754 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.278808 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r8tn\" (UniqueName: \"kubernetes.io/projected/36101b5e-a4ec-42b8-bb19-1cd2df2897c6-kube-api-access-2r8tn\") pod \"ironic-operator-controller-manager-5f4b8bd54d-bc2xw\" (UID: \"36101b5e-a4ec-42b8-bb19-1cd2df2897c6\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.278855 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h5ml\" (UniqueName: \"kubernetes.io/projected/de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d-kube-api-access-4h5ml\") pod \"manila-operator-controller-manager-7dd968899f-plj9z\" (UID: \"de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"
Feb 02 10:49:30 crc kubenswrapper[4845]: E0202 10:49:30.279261 4845 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 02 10:49:30 crc kubenswrapper[4845]: E0202 10:49:30.279306 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert podName:f3c02aa0-5039-4a4f-ae11-1bac119f7e31 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:30.779290296 +0000 UTC m=+1051.870691746 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert") pod "infra-operator-controller-manager-79955696d6-m55cc" (UID: "f3c02aa0-5039-4a4f-ae11-1bac119f7e31") : secret "infra-operator-webhook-server-cert" not found
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.281941 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.287131 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.291462 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-bfv5p"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.300514 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlwqw\" (UniqueName: \"kubernetes.io/projected/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-kube-api-access-qlwqw\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.302620 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r8tn\" (UniqueName: \"kubernetes.io/projected/36101b5e-a4ec-42b8-bb19-1cd2df2897c6-kube-api-access-2r8tn\") pod \"ironic-operator-controller-manager-5f4b8bd54d-bc2xw\" (UID: \"36101b5e-a4ec-42b8-bb19-1cd2df2897c6\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.305480 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.313593 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.314667 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.316656 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-mr87t"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.324243 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.325487 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.328819 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-q7rbb"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.328985 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.337816 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.339090 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.343861 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-7g9q2"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.351631 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.365728 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.380839 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v26qh\" (UniqueName: \"kubernetes.io/projected/568bf546-0674-4dbd-91d8-9497c682e368-kube-api-access-v26qh\") pod \"mariadb-operator-controller-manager-67bf948998-t89t9\" (UID: \"568bf546-0674-4dbd-91d8-9497c682e368\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.380900 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h5ml\" (UniqueName: \"kubernetes.io/projected/de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d-kube-api-access-4h5ml\") pod \"manila-operator-controller-manager-7dd968899f-plj9z\" (UID: \"de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.380988 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbq5x\" (UniqueName: \"kubernetes.io/projected/a1c4a4d1-3974-47c1-9efc-ee88a38e13a5-kube-api-access-sbq5x\") pod \"neutron-operator-controller-manager-585dbc889-hnt9f\" (UID: \"a1c4a4d1-3974-47c1-9efc-ee88a38e13a5\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.381007 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f46rz\" (UniqueName: \"kubernetes.io/projected/7cc6d028-e9d2-459c-b34c-d069917832a4-kube-api-access-f46rz\") pod \"keystone-operator-controller-manager-84f48565d4-w55l4\" (UID: \"7cc6d028-e9d2-459c-b34c-d069917832a4\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.381074 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjjb\" (UniqueName: \"kubernetes.io/projected/cac06f19-af65-481d-b739-68375e8d2968-kube-api-access-jtjjb\") pod \"nova-operator-controller-manager-55bff696bd-p98cd\" (UID: \"cac06f19-af65-481d-b739-68375e8d2968\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.392965 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.398348 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.407718 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h5ml\" (UniqueName: \"kubernetes.io/projected/de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d-kube-api-access-4h5ml\") pod \"manila-operator-controller-manager-7dd968899f-plj9z\" (UID: \"de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.409260 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.410059 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-4j54l"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.414050 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f46rz\" (UniqueName: \"kubernetes.io/projected/7cc6d028-e9d2-459c-b34c-d069917832a4-kube-api-access-f46rz\") pod \"keystone-operator-controller-manager-84f48565d4-w55l4\" (UID: \"7cc6d028-e9d2-459c-b34c-d069917832a4\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.422116 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.423215 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.426674 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-z27q9"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.436709 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.446399 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.468449 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.469831 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.474869 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-pmj6d"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.477519 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4"]
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.482577 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8rfm\" (UniqueName: \"kubernetes.io/projected/146aa38c-b63c-485a-9c55-006031cfcaa0-kube-api-access-r8rfm\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.482615 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.482640 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wp7\" (UniqueName: \"kubernetes.io/projected/30843195-75a4-4b59-9193-dacda845ace7-kube-api-access-j7wp7\") pod \"octavia-operator-controller-manager-6687f8d877-s97rq\" (UID: \"30843195-75a4-4b59-9193-dacda845ace7\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq"
Feb 02 10:49:30 crc kubenswrapper[4845]: I0202
10:49:30.482666 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v26qh\" (UniqueName: \"kubernetes.io/projected/568bf546-0674-4dbd-91d8-9497c682e368-kube-api-access-v26qh\") pod \"mariadb-operator-controller-manager-67bf948998-t89t9\" (UID: \"568bf546-0674-4dbd-91d8-9497c682e368\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.482707 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk98v\" (UniqueName: \"kubernetes.io/projected/70403789-9865-4c4d-a969-118a157e564e-kube-api-access-sk98v\") pod \"placement-operator-controller-manager-5b964cf4cd-9ltr5\" (UID: \"70403789-9865-4c4d-a969-118a157e564e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.482746 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47qqg\" (UniqueName: \"kubernetes.io/projected/eae9c104-9193-4404-b25a-3a47932ef374-kube-api-access-47qqg\") pod \"ovn-operator-controller-manager-788c46999f-lt2wj\" (UID: \"eae9c104-9193-4404-b25a-3a47932ef374\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.482803 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbq5x\" (UniqueName: \"kubernetes.io/projected/a1c4a4d1-3974-47c1-9efc-ee88a38e13a5-kube-api-access-sbq5x\") pod \"neutron-operator-controller-manager-585dbc889-hnt9f\" (UID: \"a1c4a4d1-3974-47c1-9efc-ee88a38e13a5\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.482875 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjjb\" 
(UniqueName: \"kubernetes.io/projected/cac06f19-af65-481d-b739-68375e8d2968-kube-api-access-jtjjb\") pod \"nova-operator-controller-manager-55bff696bd-p98cd\" (UID: \"cac06f19-af65-481d-b739-68375e8d2968\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.494952 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-mkrpp"] Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.496021 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.503574 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rqwwx" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.504477 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-mkrpp"] Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.509197 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbq5x\" (UniqueName: \"kubernetes.io/projected/a1c4a4d1-3974-47c1-9efc-ee88a38e13a5-kube-api-access-sbq5x\") pod \"neutron-operator-controller-manager-585dbc889-hnt9f\" (UID: \"a1c4a4d1-3974-47c1-9efc-ee88a38e13a5\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.509687 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v26qh\" (UniqueName: \"kubernetes.io/projected/568bf546-0674-4dbd-91d8-9497c682e368-kube-api-access-v26qh\") pod \"mariadb-operator-controller-manager-67bf948998-t89t9\" (UID: \"568bf546-0674-4dbd-91d8-9497c682e368\") " 
pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9" Feb 02 10:49:30 crc kubenswrapper[4845]: I0202 10:49:30.512371 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjjb\" (UniqueName: \"kubernetes.io/projected/cac06f19-af65-481d-b739-68375e8d2968-kube-api-access-jtjjb\") pod \"nova-operator-controller-manager-55bff696bd-p98cd\" (UID: \"cac06f19-af65-481d-b739-68375e8d2968\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.093999 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.117999 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.130211 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.130956 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.132717 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.128277 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ng7k\" (UniqueName: \"kubernetes.io/projected/09ccace8-b972-48ae-a15d-ecf88a300105-kube-api-access-9ng7k\") pod \"test-operator-controller-manager-56f8bfcd9f-5f4q4\" (UID: \"09ccace8-b972-48ae-a15d-ecf88a300105\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.157703 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.157794 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvw4x\" (UniqueName: \"kubernetes.io/projected/1cec8fc8-2b7b-4332-92a4-05483486f925-kube-api-access-jvw4x\") pod \"telemetry-operator-controller-manager-6bbb97ddc6-fx4tn\" (UID: \"1cec8fc8-2b7b-4332-92a4-05483486f925\") " pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.157877 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8rfm\" (UniqueName: \"kubernetes.io/projected/146aa38c-b63c-485a-9c55-006031cfcaa0-kube-api-access-r8rfm\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:31 crc 
kubenswrapper[4845]: I0202 10:49:31.157925 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.157952 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnjj\" (UniqueName: \"kubernetes.io/projected/817413ef-6c47-47ec-8e08-8dffd27c1e11-kube-api-access-wtnjj\") pod \"swift-operator-controller-manager-68fc8c869-w7bxj\" (UID: \"817413ef-6c47-47ec-8e08-8dffd27c1e11\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.157984 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wp7\" (UniqueName: \"kubernetes.io/projected/30843195-75a4-4b59-9193-dacda845ace7-kube-api-access-j7wp7\") pod \"octavia-operator-controller-manager-6687f8d877-s97rq\" (UID: \"30843195-75a4-4b59-9193-dacda845ace7\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.158065 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk98v\" (UniqueName: \"kubernetes.io/projected/70403789-9865-4c4d-a969-118a157e564e-kube-api-access-sk98v\") pod \"placement-operator-controller-manager-5b964cf4cd-9ltr5\" (UID: \"70403789-9865-4c4d-a969-118a157e564e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.158110 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47qqg\" 
(UniqueName: \"kubernetes.io/projected/eae9c104-9193-4404-b25a-3a47932ef374-kube-api-access-47qqg\") pod \"ovn-operator-controller-manager-788c46999f-lt2wj\" (UID: \"eae9c104-9193-4404-b25a-3a47932ef374\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj" Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.162975 4845 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.163040 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert podName:f3c02aa0-5039-4a4f-ae11-1bac119f7e31 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:32.163020646 +0000 UTC m=+1053.254422096 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert") pod "infra-operator-controller-manager-79955696d6-m55cc" (UID: "f3c02aa0-5039-4a4f-ae11-1bac119f7e31") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.164453 4845 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.164507 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert podName:146aa38c-b63c-485a-9c55-006031cfcaa0 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:31.664492477 +0000 UTC m=+1052.755893927 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" (UID: "146aa38c-b63c-485a-9c55-006031cfcaa0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.167414 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.219718 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8rfm\" (UniqueName: \"kubernetes.io/projected/146aa38c-b63c-485a-9c55-006031cfcaa0-kube-api-access-r8rfm\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.228514 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk98v\" (UniqueName: \"kubernetes.io/projected/70403789-9865-4c4d-a969-118a157e564e-kube-api-access-sk98v\") pod \"placement-operator-controller-manager-5b964cf4cd-9ltr5\" (UID: \"70403789-9865-4c4d-a969-118a157e564e\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.236024 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47qqg\" (UniqueName: \"kubernetes.io/projected/eae9c104-9193-4404-b25a-3a47932ef374-kube-api-access-47qqg\") pod \"ovn-operator-controller-manager-788c46999f-lt2wj\" (UID: \"eae9c104-9193-4404-b25a-3a47932ef374\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.249685 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wp7\" (UniqueName: \"kubernetes.io/projected/30843195-75a4-4b59-9193-dacda845ace7-kube-api-access-j7wp7\") pod \"octavia-operator-controller-manager-6687f8d877-s97rq\" (UID: \"30843195-75a4-4b59-9193-dacda845ace7\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.262539 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnjj\" (UniqueName: \"kubernetes.io/projected/817413ef-6c47-47ec-8e08-8dffd27c1e11-kube-api-access-wtnjj\") pod \"swift-operator-controller-manager-68fc8c869-w7bxj\" (UID: \"817413ef-6c47-47ec-8e08-8dffd27c1e11\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.271493 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zpd\" (UniqueName: \"kubernetes.io/projected/5d4eb1a9-137a-4959-9d37-d81ee9c6dd54-kube-api-access-b6zpd\") pod \"watcher-operator-controller-manager-564965969-mkrpp\" (UID: \"5d4eb1a9-137a-4959-9d37-d81ee9c6dd54\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.271816 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ng7k\" (UniqueName: \"kubernetes.io/projected/09ccace8-b972-48ae-a15d-ecf88a300105-kube-api-access-9ng7k\") pod \"test-operator-controller-manager-56f8bfcd9f-5f4q4\" (UID: \"09ccace8-b972-48ae-a15d-ecf88a300105\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.272060 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvw4x\" (UniqueName: 
\"kubernetes.io/projected/1cec8fc8-2b7b-4332-92a4-05483486f925-kube-api-access-jvw4x\") pod \"telemetry-operator-controller-manager-6bbb97ddc6-fx4tn\" (UID: \"1cec8fc8-2b7b-4332-92a4-05483486f925\") " pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.313210 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnjj\" (UniqueName: \"kubernetes.io/projected/817413ef-6c47-47ec-8e08-8dffd27c1e11-kube-api-access-wtnjj\") pod \"swift-operator-controller-manager-68fc8c869-w7bxj\" (UID: \"817413ef-6c47-47ec-8e08-8dffd27c1e11\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.324838 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.324914 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvw4x\" (UniqueName: \"kubernetes.io/projected/1cec8fc8-2b7b-4332-92a4-05483486f925-kube-api-access-jvw4x\") pod \"telemetry-operator-controller-manager-6bbb97ddc6-fx4tn\" (UID: \"1cec8fc8-2b7b-4332-92a4-05483486f925\") " pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.328662 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ng7k\" (UniqueName: \"kubernetes.io/projected/09ccace8-b972-48ae-a15d-ecf88a300105-kube-api-access-9ng7k\") pod \"test-operator-controller-manager-56f8bfcd9f-5f4q4\" (UID: \"09ccace8-b972-48ae-a15d-ecf88a300105\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.338456 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.355665 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z"] Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.361315 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.364416 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.372305 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.372644 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zbcwm" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.372818 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.374360 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6zpd\" (UniqueName: \"kubernetes.io/projected/5d4eb1a9-137a-4959-9d37-d81ee9c6dd54-kube-api-access-b6zpd\") pod \"watcher-operator-controller-manager-564965969-mkrpp\" (UID: \"5d4eb1a9-137a-4959-9d37-d81ee9c6dd54\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.375762 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.395012 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.398134 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z"] Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.410910 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6zpd\" (UniqueName: \"kubernetes.io/projected/5d4eb1a9-137a-4959-9d37-d81ee9c6dd54-kube-api-access-b6zpd\") pod \"watcher-operator-controller-manager-564965969-mkrpp\" (UID: \"5d4eb1a9-137a-4959-9d37-d81ee9c6dd54\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.414856 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.425340 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.451165 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l"] Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.452272 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.457387 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-j652m" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.475936 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.476109 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.476179 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bvrt\" (UniqueName: \"kubernetes.io/projected/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-kube-api-access-7bvrt\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.476207 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4vgz\" (UniqueName: 
\"kubernetes.io/projected/39f98254-3b87-4ac2-be8c-7d7a0f29d6ce-kube-api-access-f4vgz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z5f9l\" (UID: \"39f98254-3b87-4ac2-be8c-7d7a0f29d6ce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.494964 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l"] Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.578626 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.578712 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bvrt\" (UniqueName: \"kubernetes.io/projected/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-kube-api-access-7bvrt\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.578743 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4vgz\" (UniqueName: \"kubernetes.io/projected/39f98254-3b87-4ac2-be8c-7d7a0f29d6ce-kube-api-access-f4vgz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z5f9l\" (UID: \"39f98254-3b87-4ac2-be8c-7d7a0f29d6ce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.578786 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.579010 4845 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.579058 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:32.079041919 +0000 UTC m=+1053.170443369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "webhook-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.579313 4845 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.579340 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:32.079330607 +0000 UTC m=+1053.170732057 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "metrics-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.603503 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bvrt\" (UniqueName: \"kubernetes.io/projected/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-kube-api-access-7bvrt\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.616418 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4vgz\" (UniqueName: \"kubernetes.io/projected/39f98254-3b87-4ac2-be8c-7d7a0f29d6ce-kube-api-access-f4vgz\") pod \"rabbitmq-cluster-operator-manager-668c99d594-z5f9l\" (UID: \"39f98254-3b87-4ac2-be8c-7d7a0f29d6ce\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.678995 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.680908 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.681091 4845 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: E0202 10:49:31.681152 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert podName:146aa38c-b63c-485a-9c55-006031cfcaa0 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:32.681137966 +0000 UTC m=+1053.772539416 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" (UID: "146aa38c-b63c-485a-9c55-006031cfcaa0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:31 crc kubenswrapper[4845]: I0202 10:49:31.818038 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.096985 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.097328 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:32 crc kubenswrapper[4845]: E0202 10:49:32.097154 4845 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:49:32 crc kubenswrapper[4845]: E0202 10:49:32.097459 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:33.097446628 +0000 UTC m=+1054.188848078 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "webhook-server-cert" not found Feb 02 10:49:32 crc kubenswrapper[4845]: E0202 10:49:32.097415 4845 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:49:32 crc kubenswrapper[4845]: E0202 10:49:32.097488 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:33.097483129 +0000 UTC m=+1054.188884579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "metrics-server-cert" not found Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.173521 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7" event={"ID":"202de28c-c44a-43d9-98fd-4b34b1dcc65f","Type":"ContainerStarted","Data":"f8b4a0f004c178a2aa77dcd6926a200f42d1037c247fa28982e04afad63f5a2d"} Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.198460 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" Feb 02 10:49:32 crc kubenswrapper[4845]: E0202 10:49:32.198631 4845 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:32 crc kubenswrapper[4845]: E0202 10:49:32.198688 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert podName:f3c02aa0-5039-4a4f-ae11-1bac119f7e31 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:34.19867192 +0000 UTC m=+1055.290073370 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert") pod "infra-operator-controller-manager-79955696d6-m55cc" (UID: "f3c02aa0-5039-4a4f-ae11-1bac119f7e31") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.570208 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.580256 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.593751 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.606443 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.614129 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"] Feb 02 10:49:32 crc kubenswrapper[4845]: W0202 10:49:32.624206 4845 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85439e8a_f7d3_4e0b_827c_bf27e8cd53dd.slice/crio-3536de7882258acaf9a1249f0cb1acfc21a30d7bc8edc1e088ab6e8c5750e4a9 WatchSource:0}: Error finding container 3536de7882258acaf9a1249f0cb1acfc21a30d7bc8edc1e088ab6e8c5750e4a9: Status 404 returned error can't find the container with id 3536de7882258acaf9a1249f0cb1acfc21a30d7bc8edc1e088ab6e8c5750e4a9 Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.707838 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:32 crc kubenswrapper[4845]: E0202 10:49:32.708329 4845 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:32 crc kubenswrapper[4845]: E0202 10:49:32.708469 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert podName:146aa38c-b63c-485a-9c55-006031cfcaa0 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:34.708454406 +0000 UTC m=+1055.799855856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" (UID: "146aa38c-b63c-485a-9c55-006031cfcaa0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.923308 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.933468 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.941137 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.953063 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5"] Feb 02 10:49:32 crc kubenswrapper[4845]: I0202 10:49:32.963527 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"] Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.127445 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.127594 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.127584 4845 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.127715 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:35.127697021 +0000 UTC m=+1056.219098471 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "metrics-server-cert" not found Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.127642 4845 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.127768 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:35.127755973 +0000 UTC m=+1056.219157423 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "webhook-server-cert" not found Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.183855 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv" event={"ID":"85439e8a-f7d3-4e0b-827c-bf27e8cd53dd","Type":"ContainerStarted","Data":"3536de7882258acaf9a1249f0cb1acfc21a30d7bc8edc1e088ab6e8c5750e4a9"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.185427 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4" event={"ID":"7cc6d028-e9d2-459c-b34c-d069917832a4","Type":"ContainerStarted","Data":"1da94ec7fa8d97577de950d25dbbbead4b863594dcc1dc46e05d40acb827ca01"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.187182 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf" event={"ID":"d9196fe1-4a04-44c1-9a5f-1ad5de52da7f","Type":"ContainerStarted","Data":"abcf9ef4634be149814be650705e77d201e4cb1dcd2c144c42fcddd6c090fe3a"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.189480 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z" event={"ID":"de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d","Type":"ContainerStarted","Data":"22ad1ac3e37e1a539566f8dabdfb1b6a2e59059917dc8b649710689fa10a33d6"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.190808 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd" 
event={"ID":"cac06f19-af65-481d-b739-68375e8d2968","Type":"ContainerStarted","Data":"5b3cd5202f49cc911c0b5cdffd1a124860a826fdc7cba2403efc0432d7fa7e99"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.191942 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx" event={"ID":"745626d8-548b-43bb-aee8-eeab34a86427","Type":"ContainerStarted","Data":"fa43a304dfcd547829c0520a2fb4c37e3936a85dcddcf9a4d61fd7f373e18be9"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.193236 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj" event={"ID":"1b72ed0e-9df5-459f-8ca9-de19874a3018","Type":"ContainerStarted","Data":"1c5b24bdeab4cb6b2e01273e2d23112229bac17f3cb9b14f8dbe859478fd1518"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.194474 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw" event={"ID":"36101b5e-a4ec-42b8-bb19-1cd2df2897c6","Type":"ContainerStarted","Data":"321f6ef88440cf52334d685e13ed4823ecad5de5b967eb68b8b192fdf85640df"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.195824 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx" event={"ID":"efa2be30-a7d0-4b26-865a-58448de203a0","Type":"ContainerStarted","Data":"c91391f9b840e9b6334a5ea84142ea7b0c1d1430f247cee4e42e0a286708b06c"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.196988 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5" event={"ID":"70403789-9865-4c4d-a969-118a157e564e","Type":"ContainerStarted","Data":"00e24f6e7e68bf059a0fdb580752617aa2c4ea0c52f7aa11fa4a126f2201d010"} Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.519195 4845 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn"] Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.537029 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f"] Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.550977 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-mkrpp"] Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.578703 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj"] Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.593434 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj"] Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.604238 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9"] Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.604691 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9ng7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-5f4q4_openstack-operators(09ccace8-b972-48ae-a15d-ecf88a300105): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.605039 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.198:5001/openstack-k8s-operators/telemetry-operator:b2c5dab9eea05a087b6cf44f7f86794324fc86bd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jvw4x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6bbb97ddc6-fx4tn_openstack-operators(1cec8fc8-2b7b-4332-92a4-05483486f925): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.605074 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b6zpd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-mkrpp_openstack-operators(5d4eb1a9-137a-4959-9d37-d81ee9c6dd54): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.606237 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" podUID="5d4eb1a9-137a-4959-9d37-d81ee9c6dd54" Feb 02 10:49:33 crc 
kubenswrapper[4845]: E0202 10:49:33.606299 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" podUID="09ccace8-b972-48ae-a15d-ecf88a300105" Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.606328 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" podUID="1cec8fc8-2b7b-4332-92a4-05483486f925" Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.606531 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4"] Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.608483 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j7wp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-s97rq_openstack-operators(30843195-75a4-4b59-9193-dacda845ace7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:49:33 crc kubenswrapper[4845]: E0202 10:49:33.609752 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" podUID="30843195-75a4-4b59-9193-dacda845ace7" Feb 02 10:49:33 crc 
kubenswrapper[4845]: I0202 10:49:33.612711 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l"] Feb 02 10:49:33 crc kubenswrapper[4845]: I0202 10:49:33.618900 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq"] Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.206471 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f" event={"ID":"a1c4a4d1-3974-47c1-9efc-ee88a38e13a5","Type":"ContainerStarted","Data":"ea16e763f815e83f3b4755f02dd90815b54d0664473503229789b51bc66f4f4f"} Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.208256 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" event={"ID":"1cec8fc8-2b7b-4332-92a4-05483486f925","Type":"ContainerStarted","Data":"9adc09b42771a038d2cc7c7ec8b623e5ab5be0ad5f0e65b1430046edae6344bf"} Feb 02 10:49:34 crc kubenswrapper[4845]: E0202 10:49:34.210584 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.198:5001/openstack-k8s-operators/telemetry-operator:b2c5dab9eea05a087b6cf44f7f86794324fc86bd\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" podUID="1cec8fc8-2b7b-4332-92a4-05483486f925" Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.213096 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj" event={"ID":"eae9c104-9193-4404-b25a-3a47932ef374","Type":"ContainerStarted","Data":"44a42a8e7fbf3e33a7945a482082f81cae51f7bf0e53fd9f69a6a1851fcc8928"} Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.216123 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" event={"ID":"817413ef-6c47-47ec-8e08-8dffd27c1e11","Type":"ContainerStarted","Data":"ff786ddb469786a1a25136331be442467bb860318b6ff32935fca4585686974d"} Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.218862 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" event={"ID":"09ccace8-b972-48ae-a15d-ecf88a300105","Type":"ContainerStarted","Data":"413b69799994ff2d087741ca276b0aa54e63135cd033861c59502598eea46821"} Feb 02 10:49:34 crc kubenswrapper[4845]: E0202 10:49:34.220696 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" podUID="09ccace8-b972-48ae-a15d-ecf88a300105" Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.227411 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9" event={"ID":"568bf546-0674-4dbd-91d8-9497c682e368","Type":"ContainerStarted","Data":"2d6909ab02a4ae9cbe8ec2219c9cd4f1dcfa93de71806f5c84767b9de149ee73"} Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.230096 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" event={"ID":"30843195-75a4-4b59-9193-dacda845ace7","Type":"ContainerStarted","Data":"1575921b56f986bb3b342a7733fd3080f25db04773df72e09ef74224f17fb3b5"} Feb 02 10:49:34 crc kubenswrapper[4845]: E0202 10:49:34.231464 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" podUID="30843195-75a4-4b59-9193-dacda845ace7" Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.232281 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" event={"ID":"5d4eb1a9-137a-4959-9d37-d81ee9c6dd54","Type":"ContainerStarted","Data":"dcdb10030bfbb72a5c1aebce163ff7e15cf200a78777304ca4418deec6dfb48b"} Feb 02 10:49:34 crc kubenswrapper[4845]: E0202 10:49:34.233702 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" podUID="5d4eb1a9-137a-4959-9d37-d81ee9c6dd54" Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.234647 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" event={"ID":"39f98254-3b87-4ac2-be8c-7d7a0f29d6ce","Type":"ContainerStarted","Data":"c84db55a11f73b38ba79690b9609ed7a732d13bb75727416a3ec839932214a0c"} Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.258509 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" Feb 02 10:49:34 crc kubenswrapper[4845]: E0202 10:49:34.259336 4845 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:34 crc kubenswrapper[4845]: E0202 10:49:34.259380 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert podName:f3c02aa0-5039-4a4f-ae11-1bac119f7e31 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:38.259365822 +0000 UTC m=+1059.350767272 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert") pod "infra-operator-controller-manager-79955696d6-m55cc" (UID: "f3c02aa0-5039-4a4f-ae11-1bac119f7e31") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:34 crc kubenswrapper[4845]: I0202 10:49:34.765904 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:34 crc kubenswrapper[4845]: E0202 10:49:34.766050 4845 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:34 crc kubenswrapper[4845]: E0202 10:49:34.766384 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert podName:146aa38c-b63c-485a-9c55-006031cfcaa0 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:38.766364439 +0000 UTC m=+1059.857765879 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" (UID: "146aa38c-b63c-485a-9c55-006031cfcaa0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:35 crc kubenswrapper[4845]: I0202 10:49:35.173049 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:35 crc kubenswrapper[4845]: I0202 10:49:35.173170 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:35 crc kubenswrapper[4845]: E0202 10:49:35.173269 4845 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:49:35 crc kubenswrapper[4845]: E0202 10:49:35.173331 4845 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:49:35 crc kubenswrapper[4845]: E0202 10:49:35.173348 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:39.173329616 +0000 UTC m=+1060.264731066 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "webhook-server-cert" not found Feb 02 10:49:35 crc kubenswrapper[4845]: E0202 10:49:35.173379 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:39.173364947 +0000 UTC m=+1060.264766397 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "metrics-server-cert" not found Feb 02 10:49:35 crc kubenswrapper[4845]: E0202 10:49:35.244854 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" podUID="30843195-75a4-4b59-9193-dacda845ace7" Feb 02 10:49:35 crc kubenswrapper[4845]: E0202 10:49:35.244966 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" podUID="5d4eb1a9-137a-4959-9d37-d81ee9c6dd54" Feb 02 10:49:35 crc kubenswrapper[4845]: E0202 10:49:35.245176 4845 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" podUID="09ccace8-b972-48ae-a15d-ecf88a300105" Feb 02 10:49:35 crc kubenswrapper[4845]: E0202 10:49:35.246927 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.198:5001/openstack-k8s-operators/telemetry-operator:b2c5dab9eea05a087b6cf44f7f86794324fc86bd\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" podUID="1cec8fc8-2b7b-4332-92a4-05483486f925" Feb 02 10:49:38 crc kubenswrapper[4845]: I0202 10:49:38.329932 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" Feb 02 10:49:38 crc kubenswrapper[4845]: E0202 10:49:38.330135 4845 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:38 crc kubenswrapper[4845]: E0202 10:49:38.330449 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert podName:f3c02aa0-5039-4a4f-ae11-1bac119f7e31 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:46.330424522 +0000 UTC m=+1067.421826042 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert") pod "infra-operator-controller-manager-79955696d6-m55cc" (UID: "f3c02aa0-5039-4a4f-ae11-1bac119f7e31") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:38 crc kubenswrapper[4845]: I0202 10:49:38.840373 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:38 crc kubenswrapper[4845]: E0202 10:49:38.840540 4845 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:38 crc kubenswrapper[4845]: E0202 10:49:38.840588 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert podName:146aa38c-b63c-485a-9c55-006031cfcaa0 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:46.840573767 +0000 UTC m=+1067.931975217 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert") pod "openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" (UID: "146aa38c-b63c-485a-9c55-006031cfcaa0") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:49:39 crc kubenswrapper[4845]: I0202 10:49:39.249939 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:39 crc kubenswrapper[4845]: I0202 10:49:39.250140 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:39 crc kubenswrapper[4845]: E0202 10:49:39.250146 4845 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:49:39 crc kubenswrapper[4845]: E0202 10:49:39.250212 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:47.25019319 +0000 UTC m=+1068.341594640 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "webhook-server-cert" not found Feb 02 10:49:39 crc kubenswrapper[4845]: E0202 10:49:39.250372 4845 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:49:39 crc kubenswrapper[4845]: E0202 10:49:39.250469 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:49:47.250448257 +0000 UTC m=+1068.341849757 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "metrics-server-cert" not found Feb 02 10:49:46 crc kubenswrapper[4845]: I0202 10:49:46.237506 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:49:46 crc kubenswrapper[4845]: I0202 10:49:46.239952 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:49:46 crc kubenswrapper[4845]: I0202 10:49:46.396530 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" Feb 02 10:49:46 crc kubenswrapper[4845]: E0202 10:49:46.396718 4845 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:46 crc kubenswrapper[4845]: E0202 10:49:46.396792 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert podName:f3c02aa0-5039-4a4f-ae11-1bac119f7e31 nodeName:}" failed. No retries permitted until 2026-02-02 10:50:02.396772109 +0000 UTC m=+1083.488173569 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert") pod "infra-operator-controller-manager-79955696d6-m55cc" (UID: "f3c02aa0-5039-4a4f-ae11-1bac119f7e31") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:49:46 crc kubenswrapper[4845]: I0202 10:49:46.904794 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:46 crc kubenswrapper[4845]: I0202 10:49:46.910733 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/146aa38c-b63c-485a-9c55-006031cfcaa0-cert\") pod \"openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh\" (UID: \"146aa38c-b63c-485a-9c55-006031cfcaa0\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:46 crc kubenswrapper[4845]: I0202 10:49:46.951606 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" Feb 02 10:49:47 crc kubenswrapper[4845]: E0202 10:49:47.084789 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8" Feb 02 10:49:47 crc kubenswrapper[4845]: E0202 10:49:47.085066 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6x7k9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-5fb775575f-msjcj_openstack-operators(1b72ed0e-9df5-459f-8ca9-de19874a3018): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:47 crc kubenswrapper[4845]: E0202 10:49:47.086305 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj" podUID="1b72ed0e-9df5-459f-8ca9-de19874a3018" Feb 02 10:49:47 crc kubenswrapper[4845]: I0202 10:49:47.312731 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod 
\"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:47 crc kubenswrapper[4845]: E0202 10:49:47.312965 4845 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:49:47 crc kubenswrapper[4845]: E0202 10:49:47.314109 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. No retries permitted until 2026-02-02 10:50:03.314085948 +0000 UTC m=+1084.405487408 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "webhook-server-cert" not found Feb 02 10:49:47 crc kubenswrapper[4845]: I0202 10:49:47.314219 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:49:47 crc kubenswrapper[4845]: E0202 10:49:47.314445 4845 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:49:47 crc kubenswrapper[4845]: E0202 10:49:47.314554 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs podName:bd7f3a0c-1bdf-4673-b657-f56e7040f2a1 nodeName:}" failed. 
No retries permitted until 2026-02-02 10:50:03.314532081 +0000 UTC m=+1084.405933621 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs") pod "openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" (UID: "bd7f3a0c-1bdf-4673-b657-f56e7040f2a1") : secret "metrics-server-cert" not found Feb 02 10:49:47 crc kubenswrapper[4845]: E0202 10:49:47.825104 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:027cd7ab61ef5071d9ad6b729c95a98e51cd254642f01dc019d44cc98a9232f8\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj" podUID="1b72ed0e-9df5-459f-8ca9-de19874a3018" Feb 02 10:49:48 crc kubenswrapper[4845]: E0202 10:49:48.574787 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521" Feb 02 10:49:48 crc kubenswrapper[4845]: E0202 10:49:48.575259 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2r8tn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-5f4b8bd54d-bc2xw_openstack-operators(36101b5e-a4ec-42b8-bb19-1cd2df2897c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:48 crc kubenswrapper[4845]: E0202 10:49:48.576970 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw" podUID="36101b5e-a4ec-42b8-bb19-1cd2df2897c6" Feb 02 10:49:49 crc kubenswrapper[4845]: E0202 10:49:49.367147 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:bead175f27e5f074f723694f3b66e5aa7238411bf8a27a267b9a2936e4465521\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw" podUID="36101b5e-a4ec-42b8-bb19-1cd2df2897c6" Feb 02 10:49:50 crc kubenswrapper[4845]: E0202 10:49:50.323853 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566" Feb 02 10:49:50 crc kubenswrapper[4845]: E0202 10:49:50.324129 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h5ml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7dd968899f-plj9z_openstack-operators(de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:50 crc kubenswrapper[4845]: E0202 10:49:50.325276 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z" podUID="de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d" Feb 02 10:49:50 crc kubenswrapper[4845]: E0202 10:49:50.374711 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:cd911e8d7a7a1104d77691dbaaf54370015cbb82859337746db5a9186d5dc566\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z" podUID="de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d" Feb 02 10:49:50 crc kubenswrapper[4845]: E0202 10:49:50.935445 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898" Feb 02 10:49:50 crc kubenswrapper[4845]: E0202 10:49:50.935678 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-76zt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d874c8fc-c4jdf_openstack-operators(d9196fe1-4a04-44c1-9a5f-1ad5de52da7f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:50 crc kubenswrapper[4845]: E0202 10:49:50.937503 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf" podUID="d9196fe1-4a04-44c1-9a5f-1ad5de52da7f" Feb 02 10:49:51 crc kubenswrapper[4845]: E0202 10:49:51.384521 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:6e21a1dda86ba365817102d23a5d4d2d5dcd1c4d8e5f8d74bd24548aa8c63898\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf" podUID="d9196fe1-4a04-44c1-9a5f-1ad5de52da7f" Feb 02 10:49:53 crc kubenswrapper[4845]: E0202 10:49:53.557182 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf" Feb 02 10:49:53 crc kubenswrapper[4845]: E0202 10:49:53.557945 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v26qh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf948998-t89t9_openstack-operators(568bf546-0674-4dbd-91d8-9497c682e368): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:53 crc kubenswrapper[4845]: E0202 10:49:53.559375 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9" podUID="568bf546-0674-4dbd-91d8-9497c682e368" Feb 02 10:49:54 crc kubenswrapper[4845]: E0202 10:49:54.405479 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9" podUID="568bf546-0674-4dbd-91d8-9497c682e368" Feb 02 10:49:55 crc kubenswrapper[4845]: E0202 10:49:55.364966 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6" Feb 02 10:49:55 crc kubenswrapper[4845]: E0202 10:49:55.365380 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sbq5x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-585dbc889-hnt9f_openstack-operators(a1c4a4d1-3974-47c1-9efc-ee88a38e13a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:55 crc kubenswrapper[4845]: E0202 10:49:55.367270 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f" podUID="a1c4a4d1-3974-47c1-9efc-ee88a38e13a5" Feb 02 10:49:55 crc kubenswrapper[4845]: E0202 10:49:55.413822 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:bbb46b8b3b69fdfad7bafc10a7e88f6ea58bcdc3c91e30beb79e24417d52e0f6\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f" podUID="a1c4a4d1-3974-47c1-9efc-ee88a38e13a5" Feb 02 10:49:55 crc kubenswrapper[4845]: E0202 10:49:55.894898 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382" Feb 02 10:49:55 crc kubenswrapper[4845]: E0202 10:49:55.895081 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtnjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68fc8c869-w7bxj_openstack-operators(817413ef-6c47-47ec-8e08-8dffd27c1e11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:55 crc kubenswrapper[4845]: E0202 10:49:55.897097 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" podUID="817413ef-6c47-47ec-8e08-8dffd27c1e11" Feb 02 10:49:56 crc kubenswrapper[4845]: E0202 10:49:56.292030 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17" Feb 02 10:49:56 crc kubenswrapper[4845]: E0202 10:49:56.292205 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f46rz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-84f48565d4-w55l4_openstack-operators(7cc6d028-e9d2-459c-b34c-d069917832a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:56 crc kubenswrapper[4845]: E0202 10:49:56.293444 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4" podUID="7cc6d028-e9d2-459c-b34c-d069917832a4" Feb 02 10:49:56 crc kubenswrapper[4845]: E0202 10:49:56.421021 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:319c969e88f109b26487a9f5a67203682803d7386424703ab7ca0340be99ae17\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4" podUID="7cc6d028-e9d2-459c-b34c-d069917832a4" Feb 02 10:49:56 crc kubenswrapper[4845]: E0202 10:49:56.421639 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:42ad717de1b82267d244b016e5491a5b66a5c3deb6b8c2906a379e1296a2c382\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" podUID="817413ef-6c47-47ec-8e08-8dffd27c1e11" Feb 02 10:49:56 crc kubenswrapper[4845]: E0202 10:49:56.816577 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e" Feb 02 10:49:56 crc kubenswrapper[4845]: E0202 10:49:56.816748 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jtjjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-55bff696bd-p98cd_openstack-operators(cac06f19-af65-481d-b739-68375e8d2968): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:56 crc kubenswrapper[4845]: E0202 10:49:56.818219 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd" podUID="cac06f19-af65-481d-b739-68375e8d2968" Feb 02 10:49:57 crc kubenswrapper[4845]: E0202 10:49:57.429349 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:5340b88039fac393da49ef4e181b2720c809c27a6bb30531a07a49342a1da45e\\\"\"" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd" podUID="cac06f19-af65-481d-b739-68375e8d2968" Feb 02 10:49:59 crc kubenswrapper[4845]: E0202 10:49:59.934648 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 02 10:49:59 crc kubenswrapper[4845]: E0202 10:49:59.935401 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4vgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-z5f9l_openstack-operators(39f98254-3b87-4ac2-be8c-7d7a0f29d6ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:59 crc kubenswrapper[4845]: E0202 10:49:59.936716 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" podUID="39f98254-3b87-4ac2-be8c-7d7a0f29d6ce" Feb 02 10:50:00 crc kubenswrapper[4845]: E0202 10:50:00.473526 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" 
podUID="39f98254-3b87-4ac2-be8c-7d7a0f29d6ce" Feb 02 10:50:02 crc kubenswrapper[4845]: I0202 10:50:02.424397 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" Feb 02 10:50:02 crc kubenswrapper[4845]: I0202 10:50:02.434929 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3c02aa0-5039-4a4f-ae11-1bac119f7e31-cert\") pod \"infra-operator-controller-manager-79955696d6-m55cc\" (UID: \"f3c02aa0-5039-4a4f-ae11-1bac119f7e31\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" Feb 02 10:50:02 crc kubenswrapper[4845]: I0202 10:50:02.490752 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7" event={"ID":"202de28c-c44a-43d9-98fd-4b34b1dcc65f","Type":"ContainerStarted","Data":"6ea86470b646489dc04ee02411af87253bbf6fae406231373de24e474b5d3f44"} Feb 02 10:50:02 crc kubenswrapper[4845]: I0202 10:50:02.490946 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7" Feb 02 10:50:02 crc kubenswrapper[4845]: I0202 10:50:02.508118 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7" podStartSLOduration=11.810277426 podStartE2EDuration="33.508099168s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:31.848876989 +0000 UTC m=+1052.940278439" lastFinishedPulling="2026-02-02 10:49:53.546698731 +0000 UTC m=+1074.638100181" observedRunningTime="2026-02-02 10:50:02.505427832 +0000 UTC 
m=+1083.596829292" watchObservedRunningTime="2026-02-02 10:50:02.508099168 +0000 UTC m=+1083.599500618" Feb 02 10:50:02 crc kubenswrapper[4845]: I0202 10:50:02.674707 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-dzl6d" Feb 02 10:50:02 crc kubenswrapper[4845]: I0202 10:50:02.675587 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh"] Feb 02 10:50:02 crc kubenswrapper[4845]: I0202 10:50:02.682995 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" Feb 02 10:50:02 crc kubenswrapper[4845]: W0202 10:50:02.701260 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146aa38c_b63c_485a_9c55_006031cfcaa0.slice/crio-0cf5e906c1bdf92cea6afcc0300e1de4d45ece67b02a2b70b0748f4f2f9a38d2 WatchSource:0}: Error finding container 0cf5e906c1bdf92cea6afcc0300e1de4d45ece67b02a2b70b0748f4f2f9a38d2: Status 404 returned error can't find the container with id 0cf5e906c1bdf92cea6afcc0300e1de4d45ece67b02a2b70b0748f4f2f9a38d2 Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.310008 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"] Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.368040 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.368193 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.377817 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-webhook-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.381530 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd7f3a0c-1bdf-4673-b657-f56e7040f2a1-metrics-certs\") pod \"openstack-operator-controller-manager-5b7c7bb6c9-k6h5z\" (UID: \"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1\") " pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.464255 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zbcwm" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.470984 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.523495 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" event={"ID":"1cec8fc8-2b7b-4332-92a4-05483486f925","Type":"ContainerStarted","Data":"2f113d242413af7ec6c418077c89369d62767a51df2767c705c532067a561d01"} Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.524461 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.531678 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj" event={"ID":"eae9c104-9193-4404-b25a-3a47932ef374","Type":"ContainerStarted","Data":"722d8884135133c8161e049437f8b1919dd1c02f37c836f83c85d7181e1e5c15"} Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.533033 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.534345 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv" event={"ID":"85439e8a-f7d3-4e0b-827c-bf27e8cd53dd","Type":"ContainerStarted","Data":"bcf6c338cc6514e6540f97f7b0fbadc7fe5a207eb9a4ab07a2c9f7ef00d555e1"} Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.534873 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv" Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.535951 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" 
event={"ID":"146aa38c-b63c-485a-9c55-006031cfcaa0","Type":"ContainerStarted","Data":"0cf5e906c1bdf92cea6afcc0300e1de4d45ece67b02a2b70b0748f4f2f9a38d2"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.556691 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn" podStartSLOduration=4.807866013 podStartE2EDuration="33.556668009s" podCreationTimestamp="2026-02-02 10:49:30 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.604811803 +0000 UTC m=+1054.696213253" lastFinishedPulling="2026-02-02 10:50:02.353613799 +0000 UTC m=+1083.445015249" observedRunningTime="2026-02-02 10:50:03.53655443 +0000 UTC m=+1084.627955880" watchObservedRunningTime="2026-02-02 10:50:03.556668009 +0000 UTC m=+1084.648069459"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.574027 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx" event={"ID":"efa2be30-a7d0-4b26-865a-58448de203a0","Type":"ContainerStarted","Data":"5db06f0e6d4fe53fcc1bdab4ffdf8dc864fb27e5d461e801d85921f70ab549f5"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.574075 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.598202 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" event={"ID":"30843195-75a4-4b59-9193-dacda845ace7","Type":"ContainerStarted","Data":"32eefedddb314cb1cfb917f22e3903c5cc7aa7bf14546aa3765816e2ba63c4bd"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.599203 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.599291 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj" podStartSLOduration=11.393228334 podStartE2EDuration="34.599272814s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.600643405 +0000 UTC m=+1054.692044855" lastFinishedPulling="2026-02-02 10:49:56.806687865 +0000 UTC m=+1077.898089335" observedRunningTime="2026-02-02 10:50:03.56837312 +0000 UTC m=+1084.659774580" watchObservedRunningTime="2026-02-02 10:50:03.599272814 +0000 UTC m=+1084.690674264"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.635906 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5" event={"ID":"70403789-9865-4c4d-a969-118a157e564e","Type":"ContainerStarted","Data":"26054c7430719219d22c442c133c930866732ad30b24f2f446ce5c1e3c070d8b"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.636323 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.638124 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z" event={"ID":"de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d","Type":"ContainerStarted","Data":"75966db573248fbe4b0b90602ebe3726e8063b21b0e0a26857b5738c057756e3"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.638693 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv" podStartSLOduration=11.916782337 podStartE2EDuration="34.636337181s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.631999734 +0000 UTC m=+1053.723401194" lastFinishedPulling="2026-02-02 10:49:55.351554578 +0000 UTC m=+1076.442956038" observedRunningTime="2026-02-02 10:50:03.599262693 +0000 UTC m=+1084.690664143" watchObservedRunningTime="2026-02-02 10:50:03.636337181 +0000 UTC m=+1084.727738631"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.638961 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.640361 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx" event={"ID":"745626d8-548b-43bb-aee8-eeab34a86427","Type":"ContainerStarted","Data":"f6b1058879fd6329016e9b172d3d8474ffc4d2332bfdd3cf26b371af7171f2ab"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.640632 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.643484 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" event={"ID":"09ccace8-b972-48ae-a15d-ecf88a300105","Type":"ContainerStarted","Data":"6430dec9a81641179f6b9d383f39a0f49c4e16e4e9e0b4fd782ee73143f0dbba"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.643722 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.670473 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx" podStartSLOduration=10.530877596 podStartE2EDuration="34.670455785s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.667090816 +0000 UTC m=+1053.758492266" lastFinishedPulling="2026-02-02 10:49:56.806669005 +0000 UTC m=+1077.898070455" observedRunningTime="2026-02-02 10:50:03.62495043 +0000 UTC m=+1084.716351880" watchObservedRunningTime="2026-02-02 10:50:03.670455785 +0000 UTC m=+1084.761857235"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.686665 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj" event={"ID":"1b72ed0e-9df5-459f-8ca9-de19874a3018","Type":"ContainerStarted","Data":"cdaa6d538e1d17c85f1173a2b704e4222ea1b6ebb6ce68cb7469fe9f403b4e11"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.687652 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.688450 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq" podStartSLOduration=6.11226639 podStartE2EDuration="34.688423894s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.608321092 +0000 UTC m=+1054.699722542" lastFinishedPulling="2026-02-02 10:50:02.184478566 +0000 UTC m=+1083.275880046" observedRunningTime="2026-02-02 10:50:03.660439692 +0000 UTC m=+1084.751841152" watchObservedRunningTime="2026-02-02 10:50:03.688423894 +0000 UTC m=+1084.779825344"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.692104 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z" podStartSLOduration=5.065203791 podStartE2EDuration="34.692095787s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.933254523 +0000 UTC m=+1054.024655973" lastFinishedPulling="2026-02-02 10:50:02.560146519 +0000 UTC m=+1083.651547969" observedRunningTime="2026-02-02 10:50:03.686287373 +0000 UTC m=+1084.777688833" watchObservedRunningTime="2026-02-02 10:50:03.692095787 +0000 UTC m=+1084.783497237"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.693615 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" event={"ID":"f3c02aa0-5039-4a4f-ae11-1bac119f7e31","Type":"ContainerStarted","Data":"59895553f141e9b65908c907a5522db71611bbf9d034da48cd98009d96aa7289"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.713094 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" event={"ID":"5d4eb1a9-137a-4959-9d37-d81ee9c6dd54","Type":"ContainerStarted","Data":"0062025f5074397dfd62fa8e706a8a04e9775f3ffecbed011e7c507f92d33c05"}
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.713581 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.720160 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4" podStartSLOduration=5.11652135 podStartE2EDuration="33.720145651s" podCreationTimestamp="2026-02-02 10:49:30 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.604431222 +0000 UTC m=+1054.695832672" lastFinishedPulling="2026-02-02 10:50:02.208055523 +0000 UTC m=+1083.299456973" observedRunningTime="2026-02-02 10:50:03.719369659 +0000 UTC m=+1084.810771129" watchObservedRunningTime="2026-02-02 10:50:03.720145651 +0000 UTC m=+1084.811547101"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.784920 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5" podStartSLOduration=6.821314207 podStartE2EDuration="33.784903642s" podCreationTimestamp="2026-02-02 10:49:30 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.940326583 +0000 UTC m=+1054.031728033" lastFinishedPulling="2026-02-02 10:49:59.903916008 +0000 UTC m=+1080.995317468" observedRunningTime="2026-02-02 10:50:03.751383974 +0000 UTC m=+1084.842785424" watchObservedRunningTime="2026-02-02 10:50:03.784903642 +0000 UTC m=+1084.876305092"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.795871 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx" podStartSLOduration=11.146524454 podStartE2EDuration="34.795853481s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.624270985 +0000 UTC m=+1053.715672435" lastFinishedPulling="2026-02-02 10:49:56.273600012 +0000 UTC m=+1077.365001462" observedRunningTime="2026-02-02 10:50:03.784170341 +0000 UTC m=+1084.875571791" watchObservedRunningTime="2026-02-02 10:50:03.795853481 +0000 UTC m=+1084.887254931"
Feb 02 10:50:03 crc kubenswrapper[4845]: I0202 10:50:03.815289 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp" podStartSLOduration=5.146939411 podStartE2EDuration="33.815273281s" podCreationTimestamp="2026-02-02 10:49:30 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.604878875 +0000 UTC m=+1054.696280325" lastFinishedPulling="2026-02-02 10:50:02.273212745 +0000 UTC m=+1083.364614195" observedRunningTime="2026-02-02 10:50:03.810103514 +0000 UTC m=+1084.901504964" watchObservedRunningTime="2026-02-02 10:50:03.815273281 +0000 UTC m=+1084.906674731"
Feb 02 10:50:04 crc kubenswrapper[4845]: I0202 10:50:04.291191 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj" podStartSLOduration=5.581081659 podStartE2EDuration="35.291174118s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.610178337 +0000 UTC m=+1053.701579787" lastFinishedPulling="2026-02-02 10:50:02.320270796 +0000 UTC m=+1083.411672246" observedRunningTime="2026-02-02 10:50:03.852299348 +0000 UTC m=+1084.943700788" watchObservedRunningTime="2026-02-02 10:50:04.291174118 +0000 UTC m=+1085.382575568"
Feb 02 10:50:04 crc kubenswrapper[4845]: I0202 10:50:04.296197 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z"]
Feb 02 10:50:04 crc kubenswrapper[4845]: I0202 10:50:04.727708 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" event={"ID":"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1","Type":"ContainerStarted","Data":"0ee7e50f4b1d8bbbf13b392ab35249694e3d1477fb23b99ab648741c80100e2d"}
Feb 02 10:50:04 crc kubenswrapper[4845]: I0202 10:50:04.728109 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" event={"ID":"bd7f3a0c-1bdf-4673-b657-f56e7040f2a1","Type":"ContainerStarted","Data":"6b013b6451099395f51bd0340e59ce991874f6a2891e01de35eda48d8b26c07a"}
Feb 02 10:50:04 crc kubenswrapper[4845]: I0202 10:50:04.790724 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z" podStartSLOduration=33.790701433 podStartE2EDuration="33.790701433s" podCreationTimestamp="2026-02-02 10:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:04.773552258 +0000 UTC m=+1085.864953728" watchObservedRunningTime="2026-02-02 10:50:04.790701433 +0000 UTC m=+1085.882102893"
Feb 02 10:50:05 crc kubenswrapper[4845]: I0202 10:50:05.754536 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z"
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.771251 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw" event={"ID":"36101b5e-a4ec-42b8-bb19-1cd2df2897c6","Type":"ContainerStarted","Data":"8aab7c1a27a4d6c2760f415001de7ce65703efc6049cf580d2ccec9c43d8aef1"}
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.773054 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.774467 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" event={"ID":"f3c02aa0-5039-4a4f-ae11-1bac119f7e31","Type":"ContainerStarted","Data":"5652d6caff502b45b75b50cd13c6239a51e6558a6efdac0f0a5130c1f95e25f9"}
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.774919 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.779524 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" event={"ID":"146aa38c-b63c-485a-9c55-006031cfcaa0","Type":"ContainerStarted","Data":"8bf080431198d6bb12c1543f2c99d84aea51dd066ee236dd12512578e886c92c"}
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.780388 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh"
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.786285 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf" event={"ID":"d9196fe1-4a04-44c1-9a5f-1ad5de52da7f","Type":"ContainerStarted","Data":"ec879a488fbfb2d1d0ce4a5f06c9d0da79178eb12f2240b90633f9f7c93ab0fe"}
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.786709 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.794767 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw" podStartSLOduration=4.33490603 podStartE2EDuration="37.794747483s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.931357509 +0000 UTC m=+1054.022758959" lastFinishedPulling="2026-02-02 10:50:06.391198962 +0000 UTC m=+1087.482600412" observedRunningTime="2026-02-02 10:50:06.791319416 +0000 UTC m=+1087.882720866" watchObservedRunningTime="2026-02-02 10:50:06.794747483 +0000 UTC m=+1087.886148933"
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.820401 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh" podStartSLOduration=34.136847117 podStartE2EDuration="37.820381138s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:50:02.70727918 +0000 UTC m=+1083.798680630" lastFinishedPulling="2026-02-02 10:50:06.390813201 +0000 UTC m=+1087.482214651" observedRunningTime="2026-02-02 10:50:06.815179641 +0000 UTC m=+1087.906581091" watchObservedRunningTime="2026-02-02 10:50:06.820381138 +0000 UTC m=+1087.911782588"
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.834954 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf" podStartSLOduration=4.079619072 podStartE2EDuration="37.83493552s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.662452665 +0000 UTC m=+1053.753854115" lastFinishedPulling="2026-02-02 10:50:06.417769073 +0000 UTC m=+1087.509170563" observedRunningTime="2026-02-02 10:50:06.832639055 +0000 UTC m=+1087.924040545" watchObservedRunningTime="2026-02-02 10:50:06.83493552 +0000 UTC m=+1087.926336970"
Feb 02 10:50:06 crc kubenswrapper[4845]: I0202 10:50:06.848620 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc" podStartSLOduration=34.76500265 podStartE2EDuration="37.848602026s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:50:03.311180527 +0000 UTC m=+1084.402581977" lastFinishedPulling="2026-02-02 10:50:06.394779893 +0000 UTC m=+1087.486181353" observedRunningTime="2026-02-02 10:50:06.846338352 +0000 UTC m=+1087.937739812" watchObservedRunningTime="2026-02-02 10:50:06.848602026 +0000 UTC m=+1087.940003476"
Feb 02 10:50:08 crc kubenswrapper[4845]: I0202 10:50:08.799614 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" event={"ID":"817413ef-6c47-47ec-8e08-8dffd27c1e11","Type":"ContainerStarted","Data":"181f746d2bf3ca1d10c7c0ff5cf39c881cdca0bb5824121e7ba211ea2cf8c2ab"}
Feb 02 10:50:08 crc kubenswrapper[4845]: I0202 10:50:08.800185 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj"
Feb 02 10:50:08 crc kubenswrapper[4845]: I0202 10:50:08.819095 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj" podStartSLOduration=4.68471048 podStartE2EDuration="38.819077276s" podCreationTimestamp="2026-02-02 10:49:30 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.600960504 +0000 UTC m=+1054.692361954" lastFinishedPulling="2026-02-02 10:50:07.73532729 +0000 UTC m=+1088.826728750" observedRunningTime="2026-02-02 10:50:08.816802232 +0000 UTC m=+1089.908203682" watchObservedRunningTime="2026-02-02 10:50:08.819077276 +0000 UTC m=+1089.910478726"
Feb 02 10:50:09 crc kubenswrapper[4845]: I0202 10:50:09.813098 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9" event={"ID":"568bf546-0674-4dbd-91d8-9497c682e368","Type":"ContainerStarted","Data":"da34a8fd9d9d0955f4723dec5a28ea7662aa005abfa7f4b862c1db7965ad802a"}
Feb 02 10:50:09 crc kubenswrapper[4845]: I0202 10:50:09.814194 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9"
Feb 02 10:50:09 crc kubenswrapper[4845]: I0202 10:50:09.817015 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f" event={"ID":"a1c4a4d1-3974-47c1-9efc-ee88a38e13a5","Type":"ContainerStarted","Data":"0dc33da4c4f0c611ced8f2c0713bcb03cd29cc6170b20f4b4bd2712e93200b15"}
Feb 02 10:50:09 crc kubenswrapper[4845]: I0202 10:50:09.817355 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f"
Feb 02 10:50:09 crc kubenswrapper[4845]: I0202 10:50:09.833545 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9" podStartSLOduration=5.01070095 podStartE2EDuration="40.833528162s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.529171754 +0000 UTC m=+1054.620573204" lastFinishedPulling="2026-02-02 10:50:09.351998976 +0000 UTC m=+1090.443400416" observedRunningTime="2026-02-02 10:50:09.828675615 +0000 UTC m=+1090.920077085" watchObservedRunningTime="2026-02-02 10:50:09.833528162 +0000 UTC m=+1090.924929612"
Feb 02 10:50:09 crc kubenswrapper[4845]: I0202 10:50:09.846758 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f" podStartSLOduration=5.17123841 podStartE2EDuration="40.846739276s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.611145202 +0000 UTC m=+1054.702546652" lastFinishedPulling="2026-02-02 10:50:09.286646068 +0000 UTC m=+1090.378047518" observedRunningTime="2026-02-02 10:50:09.845374877 +0000 UTC m=+1090.936776347" watchObservedRunningTime="2026-02-02 10:50:09.846739276 +0000 UTC m=+1090.938140726"
Feb 02 10:50:10 crc kubenswrapper[4845]: I0202 10:50:10.079473 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-cfvq7"
Feb 02 10:50:10 crc kubenswrapper[4845]: I0202 10:50:10.129407 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-qfprx"
Feb 02 10:50:10 crc kubenswrapper[4845]: I0202 10:50:10.133898 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-9c2wv"
Feb 02 10:50:10 crc kubenswrapper[4845]: I0202 10:50:10.192219 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-pdfcx"
Feb 02 10:50:10 crc kubenswrapper[4845]: I0202 10:50:10.221230 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-msjcj"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.133534 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-plj9z"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.136007 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-bc2xw"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.330709 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-s97rq"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.343417 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-lt2wj"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.366747 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9ltr5"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.400724 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6bbb97ddc6-fx4tn"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.420759 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-5f4q4"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.436864 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-mkrpp"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.834073 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4" event={"ID":"7cc6d028-e9d2-459c-b34c-d069917832a4","Type":"ContainerStarted","Data":"3b139b41d46c8bbff04fc6dca47a8f78e5d94e56ca37cc12fd705c09fed358dc"}
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.834326 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.835977 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd" event={"ID":"cac06f19-af65-481d-b739-68375e8d2968","Type":"ContainerStarted","Data":"a681fbd60d79135f7f92fdd0abea76e3d9dced97e68f5dc30128161d225211cd"}
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.836670 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.861559 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4" podStartSLOduration=5.114107413 podStartE2EDuration="42.861538239s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.941374502 +0000 UTC m=+1054.032775952" lastFinishedPulling="2026-02-02 10:50:10.688805328 +0000 UTC m=+1091.780206778" observedRunningTime="2026-02-02 10:50:11.856075704 +0000 UTC m=+1092.947477164" watchObservedRunningTime="2026-02-02 10:50:11.861538239 +0000 UTC m=+1092.952939689"
Feb 02 10:50:11 crc kubenswrapper[4845]: I0202 10:50:11.882072 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd" podStartSLOduration=5.0390303 podStartE2EDuration="42.882054739s" podCreationTimestamp="2026-02-02 10:49:29 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.933951082 +0000 UTC m=+1054.025352532" lastFinishedPulling="2026-02-02 10:50:10.776975521 +0000 UTC m=+1091.868376971" observedRunningTime="2026-02-02 10:50:11.874532336 +0000 UTC m=+1092.965933786" watchObservedRunningTime="2026-02-02 10:50:11.882054739 +0000 UTC m=+1092.973456189"
Feb 02 10:50:12 crc kubenswrapper[4845]: I0202 10:50:12.693201 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-m55cc"
Feb 02 10:50:13 crc kubenswrapper[4845]: I0202 10:50:13.476754 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b7c7bb6c9-k6h5z"
Feb 02 10:50:15 crc kubenswrapper[4845]: I0202 10:50:15.873383 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" event={"ID":"39f98254-3b87-4ac2-be8c-7d7a0f29d6ce","Type":"ContainerStarted","Data":"30dd9ec48d8e6cce67dc128578c6cea6011738707c4cedc9b481031750c549ad"}
Feb 02 10:50:15 crc kubenswrapper[4845]: I0202 10:50:15.898380 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-z5f9l" podStartSLOduration=3.331175683 podStartE2EDuration="44.89835342s" podCreationTimestamp="2026-02-02 10:49:31 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.601262963 +0000 UTC m=+1054.692664413" lastFinishedPulling="2026-02-02 10:50:15.1684407 +0000 UTC m=+1096.259842150" observedRunningTime="2026-02-02 10:50:15.893629627 +0000 UTC m=+1096.985031077" watchObservedRunningTime="2026-02-02 10:50:15.89835342 +0000 UTC m=+1096.989754870"
Feb 02 10:50:16 crc kubenswrapper[4845]: I0202 10:50:16.238168 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:50:16 crc kubenswrapper[4845]: I0202 10:50:16.238619 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:50:16 crc kubenswrapper[4845]: I0202 10:50:16.959674 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh"
Feb 02 10:50:20 crc kubenswrapper[4845]: I0202 10:50:20.077458 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-c4jdf"
Feb 02 10:50:21 crc kubenswrapper[4845]: I0202 10:50:21.100346 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-w55l4"
Feb 02 10:50:21 crc kubenswrapper[4845]: I0202 10:50:21.126352 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-p98cd"
Feb 02 10:50:21 crc kubenswrapper[4845]: I0202 10:50:21.140220 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-t89t9"
Feb 02 10:50:21 crc kubenswrapper[4845]: I0202 10:50:21.151123 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-hnt9f"
Feb 02 10:50:21 crc kubenswrapper[4845]: I0202 10:50:21.378704 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-w7bxj"
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.870241 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fpl"]
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.872362 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.875543 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.875757 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.875860 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.875993 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5d6j8"
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.885975 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fpl"]
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.926439 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pxsn7"]
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.928404 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.930350 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 02 10:50:39 crc kubenswrapper[4845]: I0202 10:50:39.945259 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pxsn7"]
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.019648 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.019757 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ff22c8-ded6-4209-9503-f1e66526c1d5-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fpl\" (UID: \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.019808 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-config\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.020035 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6dgn\" (UniqueName: \"kubernetes.io/projected/a5ff22c8-ded6-4209-9503-f1e66526c1d5-kube-api-access-h6dgn\") pod \"dnsmasq-dns-675f4bcbfc-j9fpl\" (UID: \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.020111 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwxqg\" (UniqueName: \"kubernetes.io/projected/b15a2eeb-8248-40e0-b9a6-294ed99f1177-kube-api-access-cwxqg\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.122190 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ff22c8-ded6-4209-9503-f1e66526c1d5-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fpl\" (UID: \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.122278 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-config\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.122389 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6dgn\" (UniqueName: \"kubernetes.io/projected/a5ff22c8-ded6-4209-9503-f1e66526c1d5-kube-api-access-h6dgn\") pod \"dnsmasq-dns-675f4bcbfc-j9fpl\" (UID: \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.122430 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwxqg\" (UniqueName: \"kubernetes.io/projected/b15a2eeb-8248-40e0-b9a6-294ed99f1177-kube-api-access-cwxqg\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.122487 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.123134 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-config\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.123161 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ff22c8-ded6-4209-9503-f1e66526c1d5-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fpl\" (UID: \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.123638 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.141406 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6dgn\" (UniqueName: \"kubernetes.io/projected/a5ff22c8-ded6-4209-9503-f1e66526c1d5-kube-api-access-h6dgn\") pod \"dnsmasq-dns-675f4bcbfc-j9fpl\" (UID: \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.145646 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwxqg\" (UniqueName: \"kubernetes.io/projected/b15a2eeb-8248-40e0-b9a6-294ed99f1177-kube-api-access-cwxqg\") pod \"dnsmasq-dns-78dd6ddcc-pxsn7\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.207602 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.250541 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7"
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.755247 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pxsn7"]
Feb 02 10:50:40 crc kubenswrapper[4845]: I0202 10:50:40.849315 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fpl"]
Feb 02 10:50:40 crc kubenswrapper[4845]: W0202 10:50:40.854717 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ff22c8_ded6_4209_9503_f1e66526c1d5.slice/crio-3feed3c8637a37b93c2f596a1883e1905a851bfbc1c2fe28022e86b8367a3d6d WatchSource:0}: Error finding container 3feed3c8637a37b93c2f596a1883e1905a851bfbc1c2fe28022e86b8367a3d6d: Status 404 returned error can't find the container with id 3feed3c8637a37b93c2f596a1883e1905a851bfbc1c2fe28022e86b8367a3d6d
Feb 02 10:50:41 crc kubenswrapper[4845]: I0202 10:50:41.092999 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7" event={"ID":"b15a2eeb-8248-40e0-b9a6-294ed99f1177","Type":"ContainerStarted","Data":"541d01d5e724cfd983149b2a3df13b41fdb7aca180b15b693908ce4201c3363e"}
Feb 02 10:50:41 crc kubenswrapper[4845]: I0202 10:50:41.095004 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl"
event={"ID":"a5ff22c8-ded6-4209-9503-f1e66526c1d5","Type":"ContainerStarted","Data":"3feed3c8637a37b93c2f596a1883e1905a851bfbc1c2fe28022e86b8367a3d6d"} Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.661591 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fpl"] Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.684640 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d26cn"] Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.687772 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.704387 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d26cn"] Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.779875 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-config\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.780084 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-dns-svc\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.780162 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ktfp\" (UniqueName: \"kubernetes.io/projected/511adc55-f919-42e9-961d-94565550d668-kube-api-access-4ktfp\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " 
pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.882624 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ktfp\" (UniqueName: \"kubernetes.io/projected/511adc55-f919-42e9-961d-94565550d668-kube-api-access-4ktfp\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.883065 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-config\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.883109 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-dns-svc\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.883951 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-dns-svc\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.884175 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-config\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.936084 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ktfp\" (UniqueName: \"kubernetes.io/projected/511adc55-f919-42e9-961d-94565550d668-kube-api-access-4ktfp\") pod \"dnsmasq-dns-666b6646f7-d26cn\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:42 crc kubenswrapper[4845]: I0202 10:50:42.998628 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pxsn7"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.013601 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.025654 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-w59pq"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.027352 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.046853 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-w59pq"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.191693 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.191804 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-config\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc 
kubenswrapper[4845]: I0202 10:50:43.191986 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq6c6\" (UniqueName: \"kubernetes.io/projected/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-kube-api-access-gq6c6\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.294460 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq6c6\" (UniqueName: \"kubernetes.io/projected/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-kube-api-access-gq6c6\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.294867 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.294940 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-config\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.295875 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-config\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.295946 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.315808 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq6c6\" (UniqueName: \"kubernetes.io/projected/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-kube-api-access-gq6c6\") pod \"dnsmasq-dns-57d769cc4f-w59pq\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.360875 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.584687 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d26cn"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.813432 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.815810 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.818601 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.818678 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.818706 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.818646 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.818734 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.819294 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tclgq" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.820005 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.841287 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.857437 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.859724 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.865537 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.867331 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.877099 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Feb 02 10:50:43 crc kubenswrapper[4845]: W0202 10:50:43.883820 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d63dd57_08d9_4913_b1d3_36a9c8b5db2e.slice/crio-7586b993b7c8e47124f3e9c3b1a81c730d51fb7fd5b1683f7e203dc16fdb11b3 WatchSource:0}: Error finding container 7586b993b7c8e47124f3e9c3b1a81c730d51fb7fd5b1683f7e203dc16fdb11b3: Status 404 returned error can't find the container with id 7586b993b7c8e47124f3e9c3b1a81c730d51fb7fd5b1683f7e203dc16fdb11b3 Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933371 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933456 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933490 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933555 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933594 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17dd126b-742a-421d-90d5-fabe0fddb7cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17dd126b-742a-421d-90d5-fabe0fddb7cb\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933665 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933726 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rstjm\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-kube-api-access-rstjm\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933799 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 
10:50:43.933905 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.933966 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.934040 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-config-data\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.937195 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Feb 02 10:50:43 crc kubenswrapper[4845]: I0202 10:50:43.952594 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-w59pq"] Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.035592 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.035668 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.035698 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6hdt\" (UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-kube-api-access-w6hdt\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.035836 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bfbea8aa-e192-4e9b-8928-4b9078218059\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bfbea8aa-e192-4e9b-8928-4b9078218059\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.035944 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036003 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036027 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a61fa08e-868a-4415-88d5-7ed0eebbeb45-pod-info\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036060 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036083 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036106 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz27x\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-kube-api-access-mz27x\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036143 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036213 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-server-conf\") pod 
\"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036255 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036290 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036324 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17dd126b-742a-421d-90d5-fabe0fddb7cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17dd126b-742a-421d-90d5-fabe0fddb7cb\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036375 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d0a3a285-364a-4df2-8a7c-947ff673f254-pod-info\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036422 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " 
pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036453 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-server-conf\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036481 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036527 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rstjm\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-kube-api-access-rstjm\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036556 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d0a3a285-364a-4df2-8a7c-947ff673f254-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036587 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-config-data\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: 
I0202 10:50:44.036621 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036658 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a61fa08e-868a-4415-88d5-7ed0eebbeb45-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036694 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036727 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036767 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036808 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036845 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036873 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.036909 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8f3f8a22-543f-47e7-b070-4309ca65d7c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f3f8a22-543f-47e7-b070-4309ca65d7c4\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.038468 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.038768 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.039772 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.040571 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-config-data\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.040630 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-config-data\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.041467 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-config-data\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.041657 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" 
Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.041981 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.042023 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.042024 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17dd126b-742a-421d-90d5-fabe0fddb7cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17dd126b-742a-421d-90d5-fabe0fddb7cb\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3b6438ecda997c67dcc63770328ff6a865176e2ea3582236dd879581a55b9845/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.043595 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.048643 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.052690 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rstjm\" (UniqueName: \"kubernetes.io/projected/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-kube-api-access-rstjm\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.052766 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2e45ad6a-20f4-4da2-82b7-500ed29a0cd5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.084638 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17dd126b-742a-421d-90d5-fabe0fddb7cb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17dd126b-742a-421d-90d5-fabe0fddb7cb\") pod \"rabbitmq-server-0\" (UID: \"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.141939 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142316 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6hdt\" (UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-kube-api-access-w6hdt\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142341 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bfbea8aa-e192-4e9b-8928-4b9078218059\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bfbea8aa-e192-4e9b-8928-4b9078218059\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142381 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142400 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a61fa08e-868a-4415-88d5-7ed0eebbeb45-pod-info\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142423 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142440 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz27x\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-kube-api-access-mz27x\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142475 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-server-conf\") pod \"rabbitmq-server-1\" (UID: 
\"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142503 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142533 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d0a3a285-364a-4df2-8a7c-947ff673f254-pod-info\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142563 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-server-conf\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142582 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142608 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d0a3a285-364a-4df2-8a7c-947ff673f254-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142630 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-config-data\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142651 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142683 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a61fa08e-868a-4415-88d5-7ed0eebbeb45-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142712 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142745 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142773 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142813 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8f3f8a22-543f-47e7-b070-4309ca65d7c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f3f8a22-543f-47e7-b070-4309ca65d7c4\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142847 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-config-data\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.142878 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.143404 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.148414 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-config-data\") pod \"rabbitmq-server-1\" 
(UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.154595 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.154679 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.154915 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.162645 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.163416 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.163951 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d0a3a285-364a-4df2-8a7c-947ff673f254-server-conf\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.167206 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.167629 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" event={"ID":"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e","Type":"ContainerStarted","Data":"7586b993b7c8e47124f3e9c3b1a81c730d51fb7fd5b1683f7e203dc16fdb11b3"} Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.172145 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d0a3a285-364a-4df2-8a7c-947ff673f254-pod-info\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.172381 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.174561 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.174593 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bfbea8aa-e192-4e9b-8928-4b9078218059\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bfbea8aa-e192-4e9b-8928-4b9078218059\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e3c75a6c07d29ad3f6c197fd99c75e4b482cc857102eab05479c9e43ddaa56a8/globalmount\"" pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.175011 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.176025 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-server-conf\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.176908 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.177525 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " 
pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.177551 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.177679 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.177707 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8f3f8a22-543f-47e7-b070-4309ca65d7c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f3f8a22-543f-47e7-b070-4309ca65d7c4\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7877f86ceeec28f55dcdfdad53738fad251a2adbd26577d424dd55d7064b7272/globalmount\"" pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.177764 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a61fa08e-868a-4415-88d5-7ed0eebbeb45-config-data\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.178371 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a61fa08e-868a-4415-88d5-7ed0eebbeb45-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.180182 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d0a3a285-364a-4df2-8a7c-947ff673f254-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.183565 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" event={"ID":"511adc55-f919-42e9-961d-94565550d668","Type":"ContainerStarted","Data":"d2ea1511585c52b6d62ad84745c6c0e0adbcd9ee5f53b0b69c86b9217ec09f38"} Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.184038 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-84mvx" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.184064 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.183869 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.184098 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.184058 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.184193 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.185364 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 02 
10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.187100 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.190017 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.192357 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6hdt\" (UniqueName: \"kubernetes.io/projected/a61fa08e-868a-4415-88d5-7ed0eebbeb45-kube-api-access-w6hdt\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.195138 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz27x\" (UniqueName: \"kubernetes.io/projected/d0a3a285-364a-4df2-8a7c-947ff673f254-kube-api-access-mz27x\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.196444 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a61fa08e-868a-4415-88d5-7ed0eebbeb45-pod-info\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.219175 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8f3f8a22-543f-47e7-b070-4309ca65d7c4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8f3f8a22-543f-47e7-b070-4309ca65d7c4\") pod \"rabbitmq-server-1\" (UID: \"d0a3a285-364a-4df2-8a7c-947ff673f254\") " pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.239482 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-bfbea8aa-e192-4e9b-8928-4b9078218059\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bfbea8aa-e192-4e9b-8928-4b9078218059\") pod \"rabbitmq-server-2\" (UID: \"a61fa08e-868a-4415-88d5-7ed0eebbeb45\") " pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.244570 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.244688 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.244752 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.244770 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.244845 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgdrd\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-kube-api-access-sgdrd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.244970 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.245055 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.245080 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.245130 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.245256 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-160bfd2f-7d1e-4a3e-9cce-2c08e4673be2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-160bfd2f-7d1e-4a3e-9cce-2c08e4673be2\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.245341 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.253622 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349274 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349349 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349426 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349478 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-160bfd2f-7d1e-4a3e-9cce-2c08e4673be2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-160bfd2f-7d1e-4a3e-9cce-2c08e4673be2\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349502 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349587 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349616 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349658 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.349681 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.350731 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgdrd\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-kube-api-access-sgdrd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.350854 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.354642 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.355031 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 
10:50:44.356748 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.357423 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.357461 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.364611 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.364608 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.365027 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.367632 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.368522 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.368577 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-160bfd2f-7d1e-4a3e-9cce-2c08e4673be2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-160bfd2f-7d1e-4a3e-9cce-2c08e4673be2\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b2eb781001d93de0acf363dfef7c5efded5167520c7db69b373afc490c63ac37/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.396773 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgdrd\" (UniqueName: \"kubernetes.io/projected/70739f91-4fde-4bc2-b4e1-5bdb7cb0426c-kube-api-access-sgdrd\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.458475 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-160bfd2f-7d1e-4a3e-9cce-2c08e4673be2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-160bfd2f-7d1e-4a3e-9cce-2c08e4673be2\") pod \"rabbitmq-cell1-server-0\" (UID: \"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.513507 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.612563 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:44 crc kubenswrapper[4845]: I0202 10:50:44.882839 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:50:45 crc kubenswrapper[4845]: W0202 10:50:45.010621 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda61fa08e_868a_4415_88d5_7ed0eebbeb45.slice/crio-7388896d8c8e466f6fceaf6796e3bd83cb406ea71301a12e5ae9b626081454c5 WatchSource:0}: Error finding container 7388896d8c8e466f6fceaf6796e3bd83cb406ea71301a12e5ae9b626081454c5: Status 404 returned error can't find the container with id 7388896d8c8e466f6fceaf6796e3bd83cb406ea71301a12e5ae9b626081454c5 Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.012446 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.138422 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Feb 02 10:50:45 crc kubenswrapper[4845]: W0202 10:50:45.148668 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a3a285_364a_4df2_8a7c_947ff673f254.slice/crio-58ba90614d41bdc5c5b78de3603f770afb7f2d476f8ce14e4219f353162d7f61 WatchSource:0}: Error finding container 58ba90614d41bdc5c5b78de3603f770afb7f2d476f8ce14e4219f353162d7f61: Status 404 
returned error can't find the container with id 58ba90614d41bdc5c5b78de3603f770afb7f2d476f8ce14e4219f353162d7f61 Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.205852 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"a61fa08e-868a-4415-88d5-7ed0eebbeb45","Type":"ContainerStarted","Data":"7388896d8c8e466f6fceaf6796e3bd83cb406ea71301a12e5ae9b626081454c5"} Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.209368 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5","Type":"ContainerStarted","Data":"d42fb8b4a178f65e5d9a6f37454103c9b146f50a0c2805f2968ec18bab4db8bc"} Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.211293 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"d0a3a285-364a-4df2-8a7c-947ff673f254","Type":"ContainerStarted","Data":"58ba90614d41bdc5c5b78de3603f770afb7f2d476f8ce14e4219f353162d7f61"} Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.250718 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:50:45 crc kubenswrapper[4845]: W0202 10:50:45.255250 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70739f91_4fde_4bc2_b4e1_5bdb7cb0426c.slice/crio-12d1eada5c850d971f8fc58216922ff0b3fd11985867d6916c8eee86d40b59bf WatchSource:0}: Error finding container 12d1eada5c850d971f8fc58216922ff0b3fd11985867d6916c8eee86d40b59bf: Status 404 returned error can't find the container with id 12d1eada5c850d971f8fc58216922ff0b3fd11985867d6916c8eee86d40b59bf Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.387423 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.390920 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.393523 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.394439 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.394576 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.394714 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-n7q8b" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.397365 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.404266 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.473695 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c7d4707-dfce-464f-bffe-0d543bea6299-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.473751 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-config-data-default\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.473809 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7d4707-dfce-464f-bffe-0d543bea6299-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.473905 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e671cf92-e326-4d61-9db2-6fd521e03115\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e671cf92-e326-4d61-9db2-6fd521e03115\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.473941 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.473957 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7d4707-dfce-464f-bffe-0d543bea6299-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.474001 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw7rs\" (UniqueName: \"kubernetes.io/projected/0c7d4707-dfce-464f-bffe-0d543bea6299-kube-api-access-zw7rs\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.474020 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.575926 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e671cf92-e326-4d61-9db2-6fd521e03115\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e671cf92-e326-4d61-9db2-6fd521e03115\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.575986 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.576005 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7d4707-dfce-464f-bffe-0d543bea6299-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.576048 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw7rs\" (UniqueName: \"kubernetes.io/projected/0c7d4707-dfce-464f-bffe-0d543bea6299-kube-api-access-zw7rs\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.576069 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.576110 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c7d4707-dfce-464f-bffe-0d543bea6299-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.576132 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-config-data-default\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.576184 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7d4707-dfce-464f-bffe-0d543bea6299-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.577723 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-kolla-config\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.581506 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.581553 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c7d4707-dfce-464f-bffe-0d543bea6299-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.581942 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7d4707-dfce-464f-bffe-0d543bea6299-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.582344 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7d4707-dfce-464f-bffe-0d543bea6299-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.583307 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7d4707-dfce-464f-bffe-0d543bea6299-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.597847 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw7rs\" (UniqueName: \"kubernetes.io/projected/0c7d4707-dfce-464f-bffe-0d543bea6299-kube-api-access-zw7rs\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.603582 4845 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.603740 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e671cf92-e326-4d61-9db2-6fd521e03115\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e671cf92-e326-4d61-9db2-6fd521e03115\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1d745164a5bb26e3bdf1097ec500132c10359a33d584430c1154c293f2cd55f5/globalmount\"" pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.701563 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e671cf92-e326-4d61-9db2-6fd521e03115\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e671cf92-e326-4d61-9db2-6fd521e03115\") pod \"openstack-galera-0\" (UID: \"0c7d4707-dfce-464f-bffe-0d543bea6299\") " pod="openstack/openstack-galera-0" Feb 02 10:50:45 crc kubenswrapper[4845]: I0202 10:50:45.732358 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.220288 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c","Type":"ContainerStarted","Data":"12d1eada5c850d971f8fc58216922ff0b3fd11985867d6916c8eee86d40b59bf"} Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.237956 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.238039 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.238125 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.239163 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b265cb3810e3935261baf8bbd2287ce4faf34ceae4eb09c4d8144e547b3debd5"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.239214 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://b265cb3810e3935261baf8bbd2287ce4faf34ceae4eb09c4d8144e547b3debd5" gracePeriod=600 Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.744832 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.746307 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.748879 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.749874 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.749972 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.749866 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5qbzz" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.774654 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.911693 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ccf740-cc48-4863-8a7d-98548588860f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.912055 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-34bff2ba-0ab2-484f-8457-823a0af4066c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-34bff2ba-0ab2-484f-8457-823a0af4066c\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.912105 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.912148 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ccf740-cc48-4863-8a7d-98548588860f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.912210 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/25ccf740-cc48-4863-8a7d-98548588860f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.912228 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.912267 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:46 crc kubenswrapper[4845]: I0202 10:50:46.912293 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mxj7\" (UniqueName: \"kubernetes.io/projected/25ccf740-cc48-4863-8a7d-98548588860f-kube-api-access-4mxj7\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.014592 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.014643 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ccf740-cc48-4863-8a7d-98548588860f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.014701 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/25ccf740-cc48-4863-8a7d-98548588860f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.014718 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.014760 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.014785 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mxj7\" (UniqueName: \"kubernetes.io/projected/25ccf740-cc48-4863-8a7d-98548588860f-kube-api-access-4mxj7\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.014823 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ccf740-cc48-4863-8a7d-98548588860f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.014857 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-34bff2ba-0ab2-484f-8457-823a0af4066c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-34bff2ba-0ab2-484f-8457-823a0af4066c\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.019600 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.019683 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.019720 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/25ccf740-cc48-4863-8a7d-98548588860f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.020471 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/25ccf740-cc48-4863-8a7d-98548588860f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.022385 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.022417 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-34bff2ba-0ab2-484f-8457-823a0af4066c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-34bff2ba-0ab2-484f-8457-823a0af4066c\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/73d5b639844b7af7811f16645b2a90d571f59d40f63aea33e6e979c9261cbf08/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.024381 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/25ccf740-cc48-4863-8a7d-98548588860f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.031963 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25ccf740-cc48-4863-8a7d-98548588860f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.041864 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mxj7\" (UniqueName: \"kubernetes.io/projected/25ccf740-cc48-4863-8a7d-98548588860f-kube-api-access-4mxj7\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.079830 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-34bff2ba-0ab2-484f-8457-823a0af4066c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-34bff2ba-0ab2-484f-8457-823a0af4066c\") pod \"openstack-cell1-galera-0\" (UID: \"25ccf740-cc48-4863-8a7d-98548588860f\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.214507 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.215712 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.217713 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.217732 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.219876 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-fzh7s" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.240153 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.250914 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="b265cb3810e3935261baf8bbd2287ce4faf34ceae4eb09c4d8144e547b3debd5" exitCode=0 Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.250973 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"b265cb3810e3935261baf8bbd2287ce4faf34ceae4eb09c4d8144e547b3debd5"} Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.251013 4845 scope.go:117] "RemoveContainer" containerID="5c8a61ef5e1d6c97c545382d55b8a80c690bc952b158b0bc2a66b1f6b33d1ffd" Feb 02 10:50:47 crc 
kubenswrapper[4845]: I0202 10:50:47.320385 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34640b6-49ff-4638-bde8-1bc32e658907-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.320485 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34640b6-49ff-4638-bde8-1bc32e658907-config-data\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.321287 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b34640b6-49ff-4638-bde8-1bc32e658907-kolla-config\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.321598 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf97q\" (UniqueName: \"kubernetes.io/projected/b34640b6-49ff-4638-bde8-1bc32e658907-kube-api-access-hf97q\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.321635 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34640b6-49ff-4638-bde8-1bc32e658907-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.369647 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.423652 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34640b6-49ff-4638-bde8-1bc32e658907-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.423790 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34640b6-49ff-4638-bde8-1bc32e658907-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.423840 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b34640b6-49ff-4638-bde8-1bc32e658907-config-data\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.423900 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b34640b6-49ff-4638-bde8-1bc32e658907-kolla-config\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.423925 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf97q\" (UniqueName: \"kubernetes.io/projected/b34640b6-49ff-4638-bde8-1bc32e658907-kube-api-access-hf97q\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.425043 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/b34640b6-49ff-4638-bde8-1bc32e658907-config-data\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.425140 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b34640b6-49ff-4638-bde8-1bc32e658907-kolla-config\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.438172 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b34640b6-49ff-4638-bde8-1bc32e658907-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.438251 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b34640b6-49ff-4638-bde8-1bc32e658907-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.453542 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf97q\" (UniqueName: \"kubernetes.io/projected/b34640b6-49ff-4638-bde8-1bc32e658907-kube-api-access-hf97q\") pod \"memcached-0\" (UID: \"b34640b6-49ff-4638-bde8-1bc32e658907\") " pod="openstack/memcached-0" Feb 02 10:50:47 crc kubenswrapper[4845]: I0202 10:50:47.548793 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 02 10:50:48 crc kubenswrapper[4845]: I0202 10:50:48.966056 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:50:48 crc kubenswrapper[4845]: I0202 10:50:48.970900 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:50:48 crc kubenswrapper[4845]: I0202 10:50:48.977962 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-g5w45" Feb 02 10:50:48 crc kubenswrapper[4845]: I0202 10:50:48.987463 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.062047 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr6hd\" (UniqueName: \"kubernetes.io/projected/a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31-kube-api-access-pr6hd\") pod \"kube-state-metrics-0\" (UID: \"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31\") " pod="openstack/kube-state-metrics-0" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.165101 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr6hd\" (UniqueName: \"kubernetes.io/projected/a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31-kube-api-access-pr6hd\") pod \"kube-state-metrics-0\" (UID: \"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31\") " pod="openstack/kube-state-metrics-0" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.202617 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr6hd\" (UniqueName: \"kubernetes.io/projected/a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31-kube-api-access-pr6hd\") pod \"kube-state-metrics-0\" (UID: \"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31\") " pod="openstack/kube-state-metrics-0" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.298326 4845 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.783593 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl"] Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.785115 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.794170 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.794280 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-zqrg9" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.800831 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl"] Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.894947 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvfh4\" (UniqueName: \"kubernetes.io/projected/0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b-kube-api-access-cvfh4\") pod \"observability-ui-dashboards-66cbf594b5-8x2hl\" (UID: \"0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.895322 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-8x2hl\" (UID: \"0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 
10:50:49.997643 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-8x2hl\" (UID: \"0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" Feb 02 10:50:49 crc kubenswrapper[4845]: I0202 10:50:49.997752 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvfh4\" (UniqueName: \"kubernetes.io/projected/0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b-kube-api-access-cvfh4\") pod \"observability-ui-dashboards-66cbf594b5-8x2hl\" (UID: \"0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.002724 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-8x2hl\" (UID: \"0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.027638 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvfh4\" (UniqueName: \"kubernetes.io/projected/0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b-kube-api-access-cvfh4\") pod \"observability-ui-dashboards-66cbf594b5-8x2hl\" (UID: \"0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.125412 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.195454 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fd44458cd-cp9b7"] Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.196639 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.206386 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd44458cd-cp9b7"] Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.267934 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.270370 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.272516 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.272553 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.272577 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wp8jb" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.272516 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.273536 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.273699 4845 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.273815 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.281411 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.287536 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.308964 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-service-ca\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.309011 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-oauth-serving-cert\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.309056 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-config\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.309074 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-oauth-config\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.309141 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khm9t\" (UniqueName: \"kubernetes.io/projected/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-kube-api-access-khm9t\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.309169 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-serving-cert\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.309190 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-trusted-ca-bundle\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.411999 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-oauth-serving-cert\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412074 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md5br\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-kube-api-access-md5br\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412099 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-config\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412116 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-oauth-config\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412133 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412163 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412191 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412230 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412273 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-config\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412301 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khm9t\" (UniqueName: \"kubernetes.io/projected/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-kube-api-access-khm9t\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412345 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-serving-cert\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" 
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412382 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-trusted-ca-bundle\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412410 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9b04f366-8a31-4d2e-8d11-e8682d578a07-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412453 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412486 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412525 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412563 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-service-ca\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.412923 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-config\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.413369 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-service-ca\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.414010 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-trusted-ca-bundle\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.414426 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-oauth-serving-cert\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.417357 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-oauth-config\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.419371 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-console-serving-cert\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.432000 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khm9t\" (UniqueName: \"kubernetes.io/projected/d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3-kube-api-access-khm9t\") pod \"console-6fd44458cd-cp9b7\" (UID: \"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3\") " pod="openshift-console/console-6fd44458cd-cp9b7"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.515827 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9b04f366-8a31-4d2e-8d11-e8682d578a07-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.516145 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.516625 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.516712 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.516847 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md5br\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-kube-api-access-md5br\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.516951 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.517042 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.517118 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.517201 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.517301 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-config\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.521412 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.521489 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.521813 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.523105 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-config\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.523386 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.523412 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9b560f795087ddb8e1c0fbe0076d2f0e9dba0d3739abc904f350829f75b851b7/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.524083 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fd44458cd-cp9b7"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.526338 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.527701 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.527735 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.530373 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9b04f366-8a31-4d2e-8d11-e8682d578a07-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.534351 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md5br\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-kube-api-access-md5br\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.567093 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod \"prometheus-metric-storage-0\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") " pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:50 crc kubenswrapper[4845]: I0202 10:50:50.588949 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.471015 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.475681 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.478781 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.479954 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.480081 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.480142 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.480192 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dwn89"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.480362 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.524025 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tt4db"]
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.525288 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.528317 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.528514 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-49vj7"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.528689 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.532324 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tt4db"]
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.541612 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9qwr2"]
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.544687 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.549034 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9qwr2"]
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.579170 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-884e4ed0-8abb-4b74-99c7-8a8968ac54b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-884e4ed0-8abb-4b74-99c7-8a8968ac54b5\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.579341 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.579436 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.581019 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.581362 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.581645 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.581721 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.581752 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7xb\" (UniqueName: \"kubernetes.io/projected/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-kube-api-access-wc7xb\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683688 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683742 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683781 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683800 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9zzg\" (UniqueName: \"kubernetes.io/projected/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-kube-api-access-j9zzg\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683822 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7xb\" (UniqueName: \"kubernetes.io/projected/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-kube-api-access-wc7xb\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683856 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-run\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683880 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-884e4ed0-8abb-4b74-99c7-8a8968ac54b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-884e4ed0-8abb-4b74-99c7-8a8968ac54b5\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683918 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72da7703-b176-47cb-953e-de037d663c55-scripts\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683942 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-log\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683970 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.683990 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684005 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684020 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72da7703-b176-47cb-953e-de037d663c55-combined-ca-bundle\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684045 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-lib\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684064 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-run\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684085 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-run-ovn\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684102 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p245\" (UniqueName: \"kubernetes.io/projected/72da7703-b176-47cb-953e-de037d663c55-kube-api-access-2p245\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684146 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/72da7703-b176-47cb-953e-de037d663c55-ovn-controller-tls-certs\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684167 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-etc-ovs\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684183 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-scripts\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.684204 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-log-ovn\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.685433 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.686067 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-config\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.686718 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.688704 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.688725 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-884e4ed0-8abb-4b74-99c7-8a8968ac54b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-884e4ed0-8abb-4b74-99c7-8a8968ac54b5\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d0d0596cc78b094b683b64f226aabe0434fe7e7a7a678c91f7b419e7a94390d1/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.689729 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.702850 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.703621 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.705307 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7xb\" (UniqueName: \"kubernetes.io/projected/bd4a7449-0e37-44e1-9f01-bb1a336cb8cd-kube-api-access-wc7xb\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.740535 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-884e4ed0-8abb-4b74-99c7-8a8968ac54b5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-884e4ed0-8abb-4b74-99c7-8a8968ac54b5\") pod \"ovsdbserver-nb-0\" (UID: \"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd\") " pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.786275 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72da7703-b176-47cb-953e-de037d663c55-scripts\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.786665 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-log\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.786819 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72da7703-b176-47cb-953e-de037d663c55-combined-ca-bundle\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787015 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-lib\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787144 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-run\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787263 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-run-ovn\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787352 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p245\" (UniqueName: \"kubernetes.io/projected/72da7703-b176-47cb-953e-de037d663c55-kube-api-access-2p245\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787451 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/72da7703-b176-47cb-953e-de037d663c55-ovn-controller-tls-certs\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787530 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-etc-ovs\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787221 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-log\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787688 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-scripts\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787809 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-log-ovn\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.787869 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-lib\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.788050 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9zzg\" (UniqueName: \"kubernetes.io/projected/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-kube-api-access-j9zzg\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.788164 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-run-ovn\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.788171 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-run\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.788398 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-log-ovn\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.788457 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/72da7703-b176-47cb-953e-de037d663c55-scripts\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.788465 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/72da7703-b176-47cb-953e-de037d663c55-var-run\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.788575 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-var-run\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.788601 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-etc-ovs\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.790726 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72da7703-b176-47cb-953e-de037d663c55-combined-ca-bundle\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.792022 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-scripts\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.797712 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/72da7703-b176-47cb-953e-de037d663c55-ovn-controller-tls-certs\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.808259 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p245\" (UniqueName: \"kubernetes.io/projected/72da7703-b176-47cb-953e-de037d663c55-kube-api-access-2p245\") pod \"ovn-controller-tt4db\" (UID: \"72da7703-b176-47cb-953e-de037d663c55\") " pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.813866 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.817592 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9zzg\" (UniqueName: \"kubernetes.io/projected/4f430e6a-b6ca-42b5-bb37-e5104bba0bd1-kube-api-access-j9zzg\") pod \"ovn-controller-ovs-9qwr2\" (UID: \"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1\") " pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.846433 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tt4db"
Feb 02 10:50:53 crc kubenswrapper[4845]: I0202 10:50:53.868837 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9qwr2"
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.325589 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.327558 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.333438 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-glfsq"
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.333801 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.333994 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.338683 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.341496 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.447198 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0"
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.447250 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a51964-326b-42cd-8055-0822d42557f7-config\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0"
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.447284 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID:
\"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.447362 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67a51964-326b-42cd-8055-0822d42557f7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.447385 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl8g9\" (UniqueName: \"kubernetes.io/projected/67a51964-326b-42cd-8055-0822d42557f7-kube-api-access-zl8g9\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.447417 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8747bc56-b51a-4599-a12b-1803202b49b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8747bc56-b51a-4599-a12b-1803202b49b7\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.447443 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.447577 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67a51964-326b-42cd-8055-0822d42557f7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.549576 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67a51964-326b-42cd-8055-0822d42557f7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.549661 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.549686 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a51964-326b-42cd-8055-0822d42557f7-config\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.549713 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.549779 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67a51964-326b-42cd-8055-0822d42557f7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.549801 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zl8g9\" (UniqueName: \"kubernetes.io/projected/67a51964-326b-42cd-8055-0822d42557f7-kube-api-access-zl8g9\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.549833 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8747bc56-b51a-4599-a12b-1803202b49b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8747bc56-b51a-4599-a12b-1803202b49b7\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.549864 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.551038 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67a51964-326b-42cd-8055-0822d42557f7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.553453 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67a51964-326b-42cd-8055-0822d42557f7-config\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.553546 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/67a51964-326b-42cd-8055-0822d42557f7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.555452 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.556946 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.557879 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67a51964-326b-42cd-8055-0822d42557f7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.558263 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.558299 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8747bc56-b51a-4599-a12b-1803202b49b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8747bc56-b51a-4599-a12b-1803202b49b7\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/30b318debfe56b2d398558b7756ea4b3cc9937d15d6788c6c495212b4197659e/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.572468 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl8g9\" (UniqueName: \"kubernetes.io/projected/67a51964-326b-42cd-8055-0822d42557f7-kube-api-access-zl8g9\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.603110 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8747bc56-b51a-4599-a12b-1803202b49b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8747bc56-b51a-4599-a12b-1803202b49b7\") pod \"ovsdbserver-sb-0\" (UID: \"67a51964-326b-42cd-8055-0822d42557f7\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:56 crc kubenswrapper[4845]: I0202 10:50:56.654138 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 10:51:01 crc kubenswrapper[4845]: E0202 10:51:01.854210 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 02 10:51:01 crc kubenswrapper[4845]: E0202 10:51:01.854657 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h6dgn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-j9fpl_openstack(a5ff22c8-ded6-4209-9503-f1e66526c1d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:51:01 crc kubenswrapper[4845]: E0202 10:51:01.855877 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl" podUID="a5ff22c8-ded6-4209-9503-f1e66526c1d5" Feb 02 10:51:02 crc kubenswrapper[4845]: E0202 10:51:02.922393 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 02 10:51:02 crc kubenswrapper[4845]: E0202 10:51:02.922574 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cwxqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-pxsn7_openstack(b15a2eeb-8248-40e0-b9a6-294ed99f1177): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:51:02 crc kubenswrapper[4845]: E0202 10:51:02.923751 4845 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7" podUID="b15a2eeb-8248-40e0-b9a6-294ed99f1177" Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.116284 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl" Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.188295 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6dgn\" (UniqueName: \"kubernetes.io/projected/a5ff22c8-ded6-4209-9503-f1e66526c1d5-kube-api-access-h6dgn\") pod \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\" (UID: \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\") " Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.188777 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ff22c8-ded6-4209-9503-f1e66526c1d5-config\") pod \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\" (UID: \"a5ff22c8-ded6-4209-9503-f1e66526c1d5\") " Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.189331 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5ff22c8-ded6-4209-9503-f1e66526c1d5-config" (OuterVolumeSpecName: "config") pod "a5ff22c8-ded6-4209-9503-f1e66526c1d5" (UID: "a5ff22c8-ded6-4209-9503-f1e66526c1d5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.189958 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ff22c8-ded6-4209-9503-f1e66526c1d5-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.197658 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ff22c8-ded6-4209-9503-f1e66526c1d5-kube-api-access-h6dgn" (OuterVolumeSpecName: "kube-api-access-h6dgn") pod "a5ff22c8-ded6-4209-9503-f1e66526c1d5" (UID: "a5ff22c8-ded6-4209-9503-f1e66526c1d5"). InnerVolumeSpecName "kube-api-access-h6dgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.293381 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6dgn\" (UniqueName: \"kubernetes.io/projected/a5ff22c8-ded6-4209-9503-f1e66526c1d5-kube-api-access-h6dgn\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.381933 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:51:03 crc kubenswrapper[4845]: W0202 10:51:03.416216 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25ccf740_cc48_4863_8a7d_98548588860f.slice/crio-c4a023146aca1659ede14943dfd5235cc6dcd47c52eb513fd17537bff2089179 WatchSource:0}: Error finding container c4a023146aca1659ede14943dfd5235cc6dcd47c52eb513fd17537bff2089179: Status 404 returned error can't find the container with id c4a023146aca1659ede14943dfd5235cc6dcd47c52eb513fd17537bff2089179 Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.500976 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"25ccf740-cc48-4863-8a7d-98548588860f","Type":"ContainerStarted","Data":"c4a023146aca1659ede14943dfd5235cc6dcd47c52eb513fd17537bff2089179"} Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.502045 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl" event={"ID":"a5ff22c8-ded6-4209-9503-f1e66526c1d5","Type":"ContainerDied","Data":"3feed3c8637a37b93c2f596a1883e1905a851bfbc1c2fe28022e86b8367a3d6d"} Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.502116 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fpl" Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.601540 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fpl"] Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.626856 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fpl"] Feb 02 10:51:03 crc kubenswrapper[4845]: I0202 10:51:03.731448 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ff22c8-ded6-4209-9503-f1e66526c1d5" path="/var/lib/kubelet/pods/a5ff22c8-ded6-4209-9503-f1e66526c1d5/volumes" Feb 02 10:51:04 crc kubenswrapper[4845]: W0202 10:51:04.139228 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c7d4707_dfce_464f_bffe_0d543bea6299.slice/crio-45fa2ff82f9ba3fc9efe2a936e5059929a453bdaa544842bd03722f59f5c4e39 WatchSource:0}: Error finding container 45fa2ff82f9ba3fc9efe2a936e5059929a453bdaa544842bd03722f59f5c4e39: Status 404 returned error can't find the container with id 45fa2ff82f9ba3fc9efe2a936e5059929a453bdaa544842bd03722f59f5c4e39 Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.144808 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.190142 4845 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.203719 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tt4db"] Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.429300 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl"] Feb 02 10:51:04 crc kubenswrapper[4845]: W0202 10:51:04.440521 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb34640b6_49ff_4638_bde8_1bc32e658907.slice/crio-411829893a7b733a7a95d161390de8ff280d723162448f3b6e737696178e1fb9 WatchSource:0}: Error finding container 411829893a7b733a7a95d161390de8ff280d723162448f3b6e737696178e1fb9: Status 404 returned error can't find the container with id 411829893a7b733a7a95d161390de8ff280d723162448f3b6e737696178e1fb9 Feb 02 10:51:04 crc kubenswrapper[4845]: W0202 10:51:04.450299 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bd36a22_def8_4ad5_b1b2_ac23ef1ea70b.slice/crio-c77dfff68d255c219e308dea922d30a44337fe550725d5f01e63c26e179fea66 WatchSource:0}: Error finding container c77dfff68d255c219e308dea922d30a44337fe550725d5f01e63c26e179fea66: Status 404 returned error can't find the container with id c77dfff68d255c219e308dea922d30a44337fe550725d5f01e63c26e179fea66 Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.527447 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c7d4707-dfce-464f-bffe-0d543bea6299","Type":"ContainerStarted","Data":"45fa2ff82f9ba3fc9efe2a936e5059929a453bdaa544842bd03722f59f5c4e39"} Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.530686 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" event={"ID":"0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b","Type":"ContainerStarted","Data":"c77dfff68d255c219e308dea922d30a44337fe550725d5f01e63c26e179fea66"} Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.533920 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tt4db" event={"ID":"72da7703-b176-47cb-953e-de037d663c55","Type":"ContainerStarted","Data":"0508e98d889b1df3d4d2d7fdd99890c969a037aa3b75615bdf7a584fc4c78fe5"} Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.542823 4845 generic.go:334] "Generic (PLEG): container finished" podID="511adc55-f919-42e9-961d-94565550d668" containerID="56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c" exitCode=0 Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.542930 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" event={"ID":"511adc55-f919-42e9-961d-94565550d668","Type":"ContainerDied","Data":"56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c"} Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.544570 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7" event={"ID":"b15a2eeb-8248-40e0-b9a6-294ed99f1177","Type":"ContainerDied","Data":"541d01d5e724cfd983149b2a3df13b41fdb7aca180b15b693908ce4201c3363e"} Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.544614 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="541d01d5e724cfd983149b2a3df13b41fdb7aca180b15b693908ce4201c3363e" Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.554583 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"6667d6885fd474a5baafce195af3c9008051b075b4b764b236fc396ff08f675c"} Feb 02 10:51:04 crc 
kubenswrapper[4845]: I0202 10:51:04.562373 4845 generic.go:334] "Generic (PLEG): container finished" podID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" containerID="932d8191c1cbfeca1c9bdbbf6bdc47b46318b7170046d467a566c55700027df8" exitCode=0 Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.562834 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" event={"ID":"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e","Type":"ContainerDied","Data":"932d8191c1cbfeca1c9bdbbf6bdc47b46318b7170046d467a566c55700027df8"} Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.570411 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b34640b6-49ff-4638-bde8-1bc32e658907","Type":"ContainerStarted","Data":"411829893a7b733a7a95d161390de8ff280d723162448f3b6e737696178e1fb9"} Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.571583 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fd44458cd-cp9b7"] Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.587189 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.672101 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7" Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.722287 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-dns-svc\") pod \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.723001 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-config\") pod \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.724322 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwxqg\" (UniqueName: \"kubernetes.io/projected/b15a2eeb-8248-40e0-b9a6-294ed99f1177-kube-api-access-cwxqg\") pod \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\" (UID: \"b15a2eeb-8248-40e0-b9a6-294ed99f1177\") " Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.727355 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b15a2eeb-8248-40e0-b9a6-294ed99f1177" (UID: "b15a2eeb-8248-40e0-b9a6-294ed99f1177"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.727673 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-config" (OuterVolumeSpecName: "config") pod "b15a2eeb-8248-40e0-b9a6-294ed99f1177" (UID: "b15a2eeb-8248-40e0-b9a6-294ed99f1177"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.770506 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b15a2eeb-8248-40e0-b9a6-294ed99f1177-kube-api-access-cwxqg" (OuterVolumeSpecName: "kube-api-access-cwxqg") pod "b15a2eeb-8248-40e0-b9a6-294ed99f1177" (UID: "b15a2eeb-8248-40e0-b9a6-294ed99f1177"). InnerVolumeSpecName "kube-api-access-cwxqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.833969 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.834015 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwxqg\" (UniqueName: \"kubernetes.io/projected/b15a2eeb-8248-40e0-b9a6-294ed99f1177-kube-api-access-cwxqg\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.834031 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b15a2eeb-8248-40e0-b9a6-294ed99f1177-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.852515 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:51:04 crc kubenswrapper[4845]: W0202 10:51:04.870626 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7c0daea_a5c7_4695_bd4f_ad9a3aaf7d31.slice/crio-e2f9a94d7e921b062f55ff2afa3f02c589bdda200432ed3b8b900c15b062f04e WatchSource:0}: Error finding container e2f9a94d7e921b062f55ff2afa3f02c589bdda200432ed3b8b900c15b062f04e: Status 404 returned error can't find the container with id e2f9a94d7e921b062f55ff2afa3f02c589bdda200432ed3b8b900c15b062f04e 
Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.879432 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 10:51:04 crc kubenswrapper[4845]: I0202 10:51:04.983209 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.079844 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9qwr2"] Feb 02 10:51:05 crc kubenswrapper[4845]: W0202 10:51:05.102465 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f430e6a_b6ca_42b5_bb37_e5104bba0bd1.slice/crio-2e5190a29ed4be804089bade4de5c8fb5a3f27a51fe28535025b80834854824c WatchSource:0}: Error finding container 2e5190a29ed4be804089bade4de5c8fb5a3f27a51fe28535025b80834854824c: Status 404 returned error can't find the container with id 2e5190a29ed4be804089bade4de5c8fb5a3f27a51fe28535025b80834854824c Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.585510 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5","Type":"ContainerStarted","Data":"b3ca7a9a30c5133d142f8240cf49d184ac360dc0f74e089c87c96d3a92c7d96d"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.590086 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.593392 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" event={"ID":"511adc55-f919-42e9-961d-94565550d668","Type":"ContainerStarted","Data":"9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.594142 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 
10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.596380 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c","Type":"ContainerStarted","Data":"16a87dbd784c474f3d4b29bfb2f9739515e8d502ec347cc5ddd63af0721bc8af"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.606091 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd44458cd-cp9b7" event={"ID":"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3","Type":"ContainerStarted","Data":"a8d174f01b9c46523f1f8f4592326ffbc634985d50e21741e0617cfbbbfb4dc8"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.606139 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fd44458cd-cp9b7" event={"ID":"d722cd7f-7c4d-4062-8aeb-ff1a8b8c6cb3","Type":"ContainerStarted","Data":"98cceb545a8cf0dd260405b8d67f50cec74f748ea790ab4058c268d633c80df0"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.608080 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerStarted","Data":"f66ab08a88fdf01ed8eac1ea6cefb40d4702621c1aec3526c050777cfd6e0be7"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.613993 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31","Type":"ContainerStarted","Data":"e2f9a94d7e921b062f55ff2afa3f02c589bdda200432ed3b8b900c15b062f04e"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.623359 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9qwr2" event={"ID":"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1","Type":"ContainerStarted","Data":"2e5190a29ed4be804089bade4de5c8fb5a3f27a51fe28535025b80834854824c"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.625465 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd","Type":"ContainerStarted","Data":"95438a6d7fca85c477f7c0194280c32e6fdf5c2cf8a4182b711e5fb0c2b63950"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.626576 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67a51964-326b-42cd-8055-0822d42557f7","Type":"ContainerStarted","Data":"6df2ca22d7d94445a4e933111b388c806c2aac1ab6d0007aadbbd463ad1bd576"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.643627 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"d0a3a285-364a-4df2-8a7c-947ff673f254","Type":"ContainerStarted","Data":"93c6ac3518c8da645fe78eb56c15a31adc12e1d3538e14b7359e676cb11918c9"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.645578 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" podStartSLOduration=4.024834767 podStartE2EDuration="23.645552394s" podCreationTimestamp="2026-02-02 10:50:42 +0000 UTC" firstStartedPulling="2026-02-02 10:50:43.609465266 +0000 UTC m=+1124.700866716" lastFinishedPulling="2026-02-02 10:51:03.230182893 +0000 UTC m=+1144.321584343" observedRunningTime="2026-02-02 10:51:05.640177262 +0000 UTC m=+1146.731578712" watchObservedRunningTime="2026-02-02 10:51:05.645552394 +0000 UTC m=+1146.736953844" Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.653133 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pxsn7" Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.653127 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"a61fa08e-868a-4415-88d5-7ed0eebbeb45","Type":"ContainerStarted","Data":"687c373278d72892bc7af53f43d957e03afb3a8da90e855595ea0df53482a4a4"} Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.687566 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" podStartSLOduration=4.279756574 podStartE2EDuration="23.687538191s" podCreationTimestamp="2026-02-02 10:50:42 +0000 UTC" firstStartedPulling="2026-02-02 10:50:43.91517315 +0000 UTC m=+1125.006574600" lastFinishedPulling="2026-02-02 10:51:03.322954777 +0000 UTC m=+1144.414356217" observedRunningTime="2026-02-02 10:51:05.663753828 +0000 UTC m=+1146.755155278" watchObservedRunningTime="2026-02-02 10:51:05.687538191 +0000 UTC m=+1146.778939641" Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.731559 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fd44458cd-cp9b7" podStartSLOduration=15.731533435 podStartE2EDuration="15.731533435s" podCreationTimestamp="2026-02-02 10:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:05.718168797 +0000 UTC m=+1146.809570247" watchObservedRunningTime="2026-02-02 10:51:05.731533435 +0000 UTC m=+1146.822934885" Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.808925 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pxsn7"] Feb 02 10:51:05 crc kubenswrapper[4845]: I0202 10:51:05.824281 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pxsn7"] Feb 02 10:51:06 crc kubenswrapper[4845]: I0202 10:51:06.665715 4845 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" event={"ID":"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e","Type":"ContainerStarted","Data":"81b2eedcdbc73132319f670807bdc308fb485bc4c468651b7572510bbe0cf821"} Feb 02 10:51:07 crc kubenswrapper[4845]: I0202 10:51:07.733542 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b15a2eeb-8248-40e0-b9a6-294ed99f1177" path="/var/lib/kubelet/pods/b15a2eeb-8248-40e0-b9a6-294ed99f1177/volumes" Feb 02 10:51:10 crc kubenswrapper[4845]: I0202 10:51:10.524848 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:51:10 crc kubenswrapper[4845]: I0202 10:51:10.525223 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:51:10 crc kubenswrapper[4845]: I0202 10:51:10.530687 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:51:10 crc kubenswrapper[4845]: I0202 10:51:10.712568 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fd44458cd-cp9b7" Feb 02 10:51:10 crc kubenswrapper[4845]: I0202 10:51:10.800598 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58d87f97d7-w9v5x"] Feb 02 10:51:13 crc kubenswrapper[4845]: I0202 10:51:13.016391 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:51:13 crc kubenswrapper[4845]: I0202 10:51:13.364070 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:51:13 crc kubenswrapper[4845]: I0202 10:51:13.428739 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d26cn"] Feb 02 10:51:13 crc kubenswrapper[4845]: I0202 10:51:13.735709 4845 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" podUID="511adc55-f919-42e9-961d-94565550d668" containerName="dnsmasq-dns" containerID="cri-o://9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373" gracePeriod=10 Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.635765 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.747997 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd","Type":"ContainerStarted","Data":"8cd44ac549a9b7bbdf2a3f50bece178a504d7986304b63eb9f84ea47478bdd2b"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.749143 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67a51964-326b-42cd-8055-0822d42557f7","Type":"ContainerStarted","Data":"d17f965a53a332e865cdd52457dfdc1cc0e391ee5f5094a087a7141a6440ebfe"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.750117 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b34640b6-49ff-4638-bde8-1bc32e658907","Type":"ContainerStarted","Data":"a4cca1551b380ad6feeb160bd7b90b0e4b4dddb01e1cf77ece2fb25e0c5b17ad"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.750213 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.752064 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" event={"ID":"0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b","Type":"ContainerStarted","Data":"ed8bcf9838b5585148c683265901fc500970a70f9c0152f854d83a9686a534d1"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.754220 4845 generic.go:334] "Generic (PLEG): container 
finished" podID="511adc55-f919-42e9-961d-94565550d668" containerID="9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373" exitCode=0 Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.754255 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" event={"ID":"511adc55-f919-42e9-961d-94565550d668","Type":"ContainerDied","Data":"9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.754279 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.754292 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-d26cn" event={"ID":"511adc55-f919-42e9-961d-94565550d668","Type":"ContainerDied","Data":"d2ea1511585c52b6d62ad84745c6c0e0adbcd9ee5f53b0b69c86b9217ec09f38"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.754310 4845 scope.go:117] "RemoveContainer" containerID="9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.755946 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9qwr2" event={"ID":"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1","Type":"ContainerStarted","Data":"f31c8b5c6428576002537de155a0271f970a073899cc982104158309064e40b1"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.757751 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"25ccf740-cc48-4863-8a7d-98548588860f","Type":"ContainerStarted","Data":"faa9153539236c39a3f27d3d9abf674976b87091642cf826d02bd0acfc5c4742"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.759817 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31","Type":"ContainerStarted","Data":"cacef27d0dbc8696a18778de5ec2dfe7815ec16e5c7a5beb939fff7f7c4e5a61"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.759979 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.762707 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c7d4707-dfce-464f-bffe-0d543bea6299","Type":"ContainerStarted","Data":"5758beae08428aed0b9ac178c5180e65e78f2dc8d7e70739c722b61d04e15365"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.768047 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tt4db" event={"ID":"72da7703-b176-47cb-953e-de037d663c55","Type":"ContainerStarted","Data":"f23f3930f4c9ee89bf618036dd5d5758a01eef56ce7dab5c6abb14874ac1a05c"} Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.768197 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-tt4db" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.772951 4845 scope.go:117] "RemoveContainer" containerID="56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.778612 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.814882081 podStartE2EDuration="27.778593614s" podCreationTimestamp="2026-02-02 10:50:47 +0000 UTC" firstStartedPulling="2026-02-02 10:51:04.444229674 +0000 UTC m=+1145.535631124" lastFinishedPulling="2026-02-02 10:51:12.407941207 +0000 UTC m=+1153.499342657" observedRunningTime="2026-02-02 10:51:14.777145333 +0000 UTC m=+1155.868546783" watchObservedRunningTime="2026-02-02 10:51:14.778593614 +0000 UTC m=+1155.869995064" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.799041 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-dns-svc\") pod \"511adc55-f919-42e9-961d-94565550d668\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.799161 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ktfp\" (UniqueName: \"kubernetes.io/projected/511adc55-f919-42e9-961d-94565550d668-kube-api-access-4ktfp\") pod \"511adc55-f919-42e9-961d-94565550d668\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.799840 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-config\") pod \"511adc55-f919-42e9-961d-94565550d668\" (UID: \"511adc55-f919-42e9-961d-94565550d668\") " Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.806419 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/511adc55-f919-42e9-961d-94565550d668-kube-api-access-4ktfp" (OuterVolumeSpecName: "kube-api-access-4ktfp") pod "511adc55-f919-42e9-961d-94565550d668" (UID: "511adc55-f919-42e9-961d-94565550d668"). InnerVolumeSpecName "kube-api-access-4ktfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.808718 4845 scope.go:117] "RemoveContainer" containerID="9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373" Feb 02 10:51:14 crc kubenswrapper[4845]: E0202 10:51:14.809441 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373\": container with ID starting with 9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373 not found: ID does not exist" containerID="9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.809492 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373"} err="failed to get container status \"9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373\": rpc error: code = NotFound desc = could not find container \"9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373\": container with ID starting with 9cc71f92804d6cd2b52fa4cd13c0548baa684d13d966ca40265d2663512ea373 not found: ID does not exist" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.809511 4845 scope.go:117] "RemoveContainer" containerID="56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c" Feb 02 10:51:14 crc kubenswrapper[4845]: E0202 10:51:14.811236 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c\": container with ID starting with 56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c not found: ID does not exist" containerID="56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.811281 
4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c"} err="failed to get container status \"56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c\": rpc error: code = NotFound desc = could not find container \"56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c\": container with ID starting with 56734a774259fb63015cd94920845626bdeb21ac388842f459cf943fcd96e34c not found: ID does not exist" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.849421 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-tt4db" podStartSLOduration=13.629043785 podStartE2EDuration="21.849393906s" podCreationTimestamp="2026-02-02 10:50:53 +0000 UTC" firstStartedPulling="2026-02-02 10:51:04.451503959 +0000 UTC m=+1145.542905409" lastFinishedPulling="2026-02-02 10:51:12.67185408 +0000 UTC m=+1153.763255530" observedRunningTime="2026-02-02 10:51:14.843714625 +0000 UTC m=+1155.935116075" watchObservedRunningTime="2026-02-02 10:51:14.849393906 +0000 UTC m=+1155.940795376" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.875772 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.436361357 podStartE2EDuration="26.875736991s" podCreationTimestamp="2026-02-02 10:50:48 +0000 UTC" firstStartedPulling="2026-02-02 10:51:04.907415751 +0000 UTC m=+1145.998817201" lastFinishedPulling="2026-02-02 10:51:13.346791385 +0000 UTC m=+1154.438192835" observedRunningTime="2026-02-02 10:51:14.873308702 +0000 UTC m=+1155.964710172" watchObservedRunningTime="2026-02-02 10:51:14.875736991 +0000 UTC m=+1155.967138441" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.903461 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ktfp\" (UniqueName: 
\"kubernetes.io/projected/511adc55-f919-42e9-961d-94565550d668-kube-api-access-4ktfp\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:14 crc kubenswrapper[4845]: I0202 10:51:14.948392 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-8x2hl" podStartSLOduration=17.806125183 podStartE2EDuration="25.948342724s" podCreationTimestamp="2026-02-02 10:50:49 +0000 UTC" firstStartedPulling="2026-02-02 10:51:04.455516353 +0000 UTC m=+1145.546917803" lastFinishedPulling="2026-02-02 10:51:12.597733904 +0000 UTC m=+1153.689135344" observedRunningTime="2026-02-02 10:51:14.939324799 +0000 UTC m=+1156.030726249" watchObservedRunningTime="2026-02-02 10:51:14.948342724 +0000 UTC m=+1156.039744194" Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.056293 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "511adc55-f919-42e9-961d-94565550d668" (UID: "511adc55-f919-42e9-961d-94565550d668"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.107667 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.159318 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-config" (OuterVolumeSpecName: "config") pod "511adc55-f919-42e9-961d-94565550d668" (UID: "511adc55-f919-42e9-961d-94565550d668"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.209460 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/511adc55-f919-42e9-961d-94565550d668-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.600743 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d26cn"] Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.608061 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-d26cn"] Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.723573 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="511adc55-f919-42e9-961d-94565550d668" path="/var/lib/kubelet/pods/511adc55-f919-42e9-961d-94565550d668/volumes" Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.778791 4845 generic.go:334] "Generic (PLEG): container finished" podID="4f430e6a-b6ca-42b5-bb37-e5104bba0bd1" containerID="f31c8b5c6428576002537de155a0271f970a073899cc982104158309064e40b1" exitCode=0 Feb 02 10:51:15 crc kubenswrapper[4845]: I0202 10:51:15.778844 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9qwr2" event={"ID":"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1","Type":"ContainerDied","Data":"f31c8b5c6428576002537de155a0271f970a073899cc982104158309064e40b1"} Feb 02 10:51:19 crc kubenswrapper[4845]: I0202 10:51:19.305571 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 10:51:19 crc kubenswrapper[4845]: I0202 10:51:19.826490 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerStarted","Data":"f241191074a6c7fafb8932f36a972acd0bd84c8ac50c73e751b3fdd46aa2e817"} Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 
10:51:20.843538 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bd4a7449-0e37-44e1-9f01-bb1a336cb8cd","Type":"ContainerStarted","Data":"6eed53d7553fe40b78e6d4f2f944d8588f1d87db411939c19d3f41264a40d239"} Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.846338 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"67a51964-326b-42cd-8055-0822d42557f7","Type":"ContainerStarted","Data":"921d37cec6df5cd7bc740716d3628321e09070d72288a4d06af5c0cce717e5d0"} Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.851362 4845 generic.go:334] "Generic (PLEG): container finished" podID="0c7d4707-dfce-464f-bffe-0d543bea6299" containerID="5758beae08428aed0b9ac178c5180e65e78f2dc8d7e70739c722b61d04e15365" exitCode=0 Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.851429 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c7d4707-dfce-464f-bffe-0d543bea6299","Type":"ContainerDied","Data":"5758beae08428aed0b9ac178c5180e65e78f2dc8d7e70739c722b61d04e15365"} Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.856429 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9qwr2" event={"ID":"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1","Type":"ContainerStarted","Data":"68007ec0b67118411d1179cf40cafae36ac545ed762a766b8a9c84ebacbf237c"} Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.856476 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9qwr2" event={"ID":"4f430e6a-b6ca-42b5-bb37-e5104bba0bd1","Type":"ContainerStarted","Data":"56559a4056310294c44c018bc2781b154117e8dd2aa24052e502fa3fada6fccf"} Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.857499 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9qwr2" Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.857746 4845 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9qwr2" Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.873217 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.71439316 podStartE2EDuration="28.873198593s" podCreationTimestamp="2026-02-02 10:50:52 +0000 UTC" firstStartedPulling="2026-02-02 10:51:04.944722396 +0000 UTC m=+1146.036123846" lastFinishedPulling="2026-02-02 10:51:20.103527829 +0000 UTC m=+1161.194929279" observedRunningTime="2026-02-02 10:51:20.872262507 +0000 UTC m=+1161.963663957" watchObservedRunningTime="2026-02-02 10:51:20.873198593 +0000 UTC m=+1161.964600053" Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.963274 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9qwr2" podStartSLOduration=20.547287455 podStartE2EDuration="27.96325669s" podCreationTimestamp="2026-02-02 10:50:53 +0000 UTC" firstStartedPulling="2026-02-02 10:51:05.1057796 +0000 UTC m=+1146.197181050" lastFinishedPulling="2026-02-02 10:51:12.521748835 +0000 UTC m=+1153.613150285" observedRunningTime="2026-02-02 10:51:20.95158709 +0000 UTC m=+1162.042988540" watchObservedRunningTime="2026-02-02 10:51:20.96325669 +0000 UTC m=+1162.054658140" Feb 02 10:51:20 crc kubenswrapper[4845]: I0202 10:51:20.989164 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.884865431 podStartE2EDuration="25.989138122s" podCreationTimestamp="2026-02-02 10:50:55 +0000 UTC" firstStartedPulling="2026-02-02 10:51:04.99121788 +0000 UTC m=+1146.082619330" lastFinishedPulling="2026-02-02 10:51:20.095490571 +0000 UTC m=+1161.186892021" observedRunningTime="2026-02-02 10:51:20.978313066 +0000 UTC m=+1162.069714536" watchObservedRunningTime="2026-02-02 10:51:20.989138122 +0000 UTC m=+1162.080539572" Feb 02 10:51:21 crc kubenswrapper[4845]: I0202 10:51:21.654856 
4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 02 10:51:21 crc kubenswrapper[4845]: I0202 10:51:21.870931 4845 generic.go:334] "Generic (PLEG): container finished" podID="25ccf740-cc48-4863-8a7d-98548588860f" containerID="faa9153539236c39a3f27d3d9abf674976b87091642cf826d02bd0acfc5c4742" exitCode=0 Feb 02 10:51:21 crc kubenswrapper[4845]: I0202 10:51:21.871004 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"25ccf740-cc48-4863-8a7d-98548588860f","Type":"ContainerDied","Data":"faa9153539236c39a3f27d3d9abf674976b87091642cf826d02bd0acfc5c4742"} Feb 02 10:51:21 crc kubenswrapper[4845]: I0202 10:51:21.874456 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0c7d4707-dfce-464f-bffe-0d543bea6299","Type":"ContainerStarted","Data":"befdcfa987dede4011ada4e909e3925039505d823a86f1056236448d85462fca"} Feb 02 10:51:21 crc kubenswrapper[4845]: I0202 10:51:21.923788 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.394425503 podStartE2EDuration="37.923764611s" podCreationTimestamp="2026-02-02 10:50:44 +0000 UTC" firstStartedPulling="2026-02-02 10:51:04.14245677 +0000 UTC m=+1145.233858220" lastFinishedPulling="2026-02-02 10:51:12.671795878 +0000 UTC m=+1153.763197328" observedRunningTime="2026-02-02 10:51:21.922475014 +0000 UTC m=+1163.013876464" watchObservedRunningTime="2026-02-02 10:51:21.923764611 +0000 UTC m=+1163.015166071" Feb 02 10:51:22 crc kubenswrapper[4845]: I0202 10:51:22.550564 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 02 10:51:22 crc kubenswrapper[4845]: I0202 10:51:22.883421 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"25ccf740-cc48-4863-8a7d-98548588860f","Type":"ContainerStarted","Data":"57b3313ac087a2759c26c8780af3f5784d0b7477c664de8d830ec6f84c57774e"} Feb 02 10:51:22 crc kubenswrapper[4845]: I0202 10:51:22.905435 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.933759984 podStartE2EDuration="37.905412229s" podCreationTimestamp="2026-02-02 10:50:45 +0000 UTC" firstStartedPulling="2026-02-02 10:51:03.436295882 +0000 UTC m=+1144.527697332" lastFinishedPulling="2026-02-02 10:51:12.407948127 +0000 UTC m=+1153.499349577" observedRunningTime="2026-02-02 10:51:22.899935034 +0000 UTC m=+1163.991336504" watchObservedRunningTime="2026-02-02 10:51:22.905412229 +0000 UTC m=+1163.996813679" Feb 02 10:51:23 crc kubenswrapper[4845]: I0202 10:51:23.655413 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 02 10:51:23 crc kubenswrapper[4845]: I0202 10:51:23.694512 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 02 10:51:23 crc kubenswrapper[4845]: I0202 10:51:23.814859 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 02 10:51:23 crc kubenswrapper[4845]: I0202 10:51:23.814942 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 02 10:51:23 crc kubenswrapper[4845]: I0202 10:51:23.854205 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 02 10:51:23 crc kubenswrapper[4845]: I0202 10:51:23.926984 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 02 10:51:23 crc kubenswrapper[4845]: I0202 10:51:23.932930 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 02 10:51:24 crc 
kubenswrapper[4845]: I0202 10:51:24.228340 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l9ddk"] Feb 02 10:51:24 crc kubenswrapper[4845]: E0202 10:51:24.228763 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511adc55-f919-42e9-961d-94565550d668" containerName="init" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.228782 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="511adc55-f919-42e9-961d-94565550d668" containerName="init" Feb 02 10:51:24 crc kubenswrapper[4845]: E0202 10:51:24.228813 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="511adc55-f919-42e9-961d-94565550d668" containerName="dnsmasq-dns" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.228819 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="511adc55-f919-42e9-961d-94565550d668" containerName="dnsmasq-dns" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.233984 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="511adc55-f919-42e9-961d-94565550d668" containerName="dnsmasq-dns" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.235204 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.242023 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.243648 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l9ddk"] Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.320569 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtlrw\" (UniqueName: \"kubernetes.io/projected/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-kube-api-access-jtlrw\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.320623 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.320661 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.320725 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-config\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.354857 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mqgrd"] Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.364608 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.371348 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.383065 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mqgrd"] Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.424692 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtlrw\" (UniqueName: \"kubernetes.io/projected/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-kube-api-access-jtlrw\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.424773 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.424848 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.424931 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-config\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.426681 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-config\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.427565 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.429976 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.474393 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtlrw\" (UniqueName: \"kubernetes.io/projected/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-kube-api-access-jtlrw\") pod \"dnsmasq-dns-7fd796d7df-l9ddk\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.503169 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l9ddk"] Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 
10:51:24.504081 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.526659 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7pb2\" (UniqueName: \"kubernetes.io/projected/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-kube-api-access-x7pb2\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.527102 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-ovs-rundir\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.527210 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-config\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.527279 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.527315 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-combined-ca-bundle\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.527360 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-ovn-rundir\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.544358 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gfr4t"] Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.554342 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.572923 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.630610 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7pb2\" (UniqueName: \"kubernetes.io/projected/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-kube-api-access-x7pb2\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.630739 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-ovs-rundir\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.630846 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-config\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.630926 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.632143 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-combined-ca-bundle\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.632232 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-ovn-rundir\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.632625 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-ovn-rundir\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.633199 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-ovs-rundir\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.633877 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-config\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.645543 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.647435 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gfr4t"] Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.652808 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-combined-ca-bundle\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.656285 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7pb2\" (UniqueName: \"kubernetes.io/projected/0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2-kube-api-access-x7pb2\") pod \"ovn-controller-metrics-mqgrd\" (UID: \"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2\") " pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.682321 4845 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.684456 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.687490 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.687933 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.689005 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 02 10:51:24 crc kubenswrapper[4845]: I0202 10:51:24.689422 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-whxz4" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.693500 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mqgrd" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.696375 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.734298 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-297q5\" (UniqueName: \"kubernetes.io/projected/47456531-c404-4086-89b2-d159d71fdeb1-kube-api-access-297q5\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.734655 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-config\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.734709 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.734734 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.734760 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.836997 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j5d7\" (UniqueName: \"kubernetes.io/projected/53989098-3602-4958-96b3-ca7c539c29c9-kube-api-access-6j5d7\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837050 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/53989098-3602-4958-96b3-ca7c539c29c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837069 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53989098-3602-4958-96b3-ca7c539c29c9-config\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837099 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837126 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-sb\") pod 
\"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837147 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837187 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837239 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837268 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-297q5\" (UniqueName: \"kubernetes.io/projected/47456531-c404-4086-89b2-d159d71fdeb1-kube-api-access-297q5\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837341 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" 
(UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837379 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53989098-3602-4958-96b3-ca7c539c29c9-scripts\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.837442 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-config\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.838808 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.838797 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.838912 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 
10:51:24.839467 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-config\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.863737 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-297q5\" (UniqueName: \"kubernetes.io/projected/47456531-c404-4086-89b2-d159d71fdeb1-kube-api-access-297q5\") pod \"dnsmasq-dns-86db49b7ff-gfr4t\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.940581 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j5d7\" (UniqueName: \"kubernetes.io/projected/53989098-3602-4958-96b3-ca7c539c29c9-kube-api-access-6j5d7\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.940635 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/53989098-3602-4958-96b3-ca7c539c29c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.940655 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53989098-3602-4958-96b3-ca7c539c29c9-config\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.940722 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.940793 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.940868 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.940923 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53989098-3602-4958-96b3-ca7c539c29c9-scripts\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.941117 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/53989098-3602-4958-96b3-ca7c539c29c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.941598 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53989098-3602-4958-96b3-ca7c539c29c9-config\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.941708 
4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53989098-3602-4958-96b3-ca7c539c29c9-scripts\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.946195 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.947785 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.955801 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53989098-3602-4958-96b3-ca7c539c29c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:24.961122 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j5d7\" (UniqueName: \"kubernetes.io/projected/53989098-3602-4958-96b3-ca7c539c29c9-kube-api-access-6j5d7\") pod \"ovn-northd-0\" (UID: \"53989098-3602-4958-96b3-ca7c539c29c9\") " pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.043861 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.068539 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.734078 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.734647 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.817226 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.911063 4845 generic.go:334] "Generic (PLEG): container finished" podID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerID="f241191074a6c7fafb8932f36a972acd0bd84c8ac50c73e751b3fdd46aa2e817" exitCode=0 Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.911399 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerDied","Data":"f241191074a6c7fafb8932f36a972acd0bd84c8ac50c73e751b3fdd46aa2e817"} Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.923404 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.938511 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gfr4t"] Feb 02 10:51:25 crc kubenswrapper[4845]: W0202 10:51:25.947405 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47456531_c404_4086_89b2_d159d71fdeb1.slice/crio-51a381a3e9440b9cc201282058f7396573a6b9b5751fc506fe374daac0e1764f WatchSource:0}: Error finding container 
51a381a3e9440b9cc201282058f7396573a6b9b5751fc506fe374daac0e1764f: Status 404 returned error can't find the container with id 51a381a3e9440b9cc201282058f7396573a6b9b5751fc506fe374daac0e1764f Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.957099 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l9ddk"] Feb 02 10:51:25 crc kubenswrapper[4845]: W0202 10:51:25.958634 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f152c8c_6cc4_4586_9fcb_c1ddee6e81d2.slice/crio-3d6b0f98f98cd18423fd63a1ce356c926c7fbd4cb2d243c8914bd4b2ad6b46b4 WatchSource:0}: Error finding container 3d6b0f98f98cd18423fd63a1ce356c926c7fbd4cb2d243c8914bd4b2ad6b46b4: Status 404 returned error can't find the container with id 3d6b0f98f98cd18423fd63a1ce356c926c7fbd4cb2d243c8914bd4b2ad6b46b4 Feb 02 10:51:25 crc kubenswrapper[4845]: I0202 10:51:25.970310 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mqgrd"] Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.030368 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.922479 4845 generic.go:334] "Generic (PLEG): container finished" podID="db65c5b8-1fc4-43f9-89bd-51ebb710eccd" containerID="faf3833dd05c13c51f3b521ae7124b1dde7b455aa0d1bcc6b64cb4774ec7cdfd" exitCode=0 Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.923012 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" event={"ID":"db65c5b8-1fc4-43f9-89bd-51ebb710eccd","Type":"ContainerDied","Data":"faf3833dd05c13c51f3b521ae7124b1dde7b455aa0d1bcc6b64cb4774ec7cdfd"} Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.923048 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" 
event={"ID":"db65c5b8-1fc4-43f9-89bd-51ebb710eccd","Type":"ContainerStarted","Data":"cd82bf219d534acecb1bc063cfd4353f103c02c9ba7678657bf6ce8f14343b3f"} Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.925140 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"53989098-3602-4958-96b3-ca7c539c29c9","Type":"ContainerStarted","Data":"98a909ba82ca34783d904a217c691c1cca3ed276a0ab152887bf580551bfd39f"} Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.927759 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mqgrd" event={"ID":"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2","Type":"ContainerStarted","Data":"4fdf855357cf9832a13c007b3c91feb5764c38de34e1f8430a07358142f84beb"} Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.927831 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mqgrd" event={"ID":"0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2","Type":"ContainerStarted","Data":"3d6b0f98f98cd18423fd63a1ce356c926c7fbd4cb2d243c8914bd4b2ad6b46b4"} Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.935595 4845 generic.go:334] "Generic (PLEG): container finished" podID="47456531-c404-4086-89b2-d159d71fdeb1" containerID="30b7a3b349a6bebefe667d17eaae2b3cbdc973ed20f342ad0701818a0d0cb4b7" exitCode=0 Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.936337 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" event={"ID":"47456531-c404-4086-89b2-d159d71fdeb1","Type":"ContainerDied","Data":"30b7a3b349a6bebefe667d17eaae2b3cbdc973ed20f342ad0701818a0d0cb4b7"} Feb 02 10:51:26 crc kubenswrapper[4845]: I0202 10:51:26.936393 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" event={"ID":"47456531-c404-4086-89b2-d159d71fdeb1","Type":"ContainerStarted","Data":"51a381a3e9440b9cc201282058f7396573a6b9b5751fc506fe374daac0e1764f"} Feb 02 10:51:26 crc 
kubenswrapper[4845]: I0202 10:51:26.994152 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mqgrd" podStartSLOduration=2.9941369780000002 podStartE2EDuration="2.994136978s" podCreationTimestamp="2026-02-02 10:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:26.979526941 +0000 UTC m=+1168.070928391" watchObservedRunningTime="2026-02-02 10:51:26.994136978 +0000 UTC m=+1168.085538428" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.197436 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6d20-account-create-update-8zrt2"] Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.198733 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.210573 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.233017 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d20-account-create-update-8zrt2"] Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.266171 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-q8crj"] Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.267537 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.276074 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8crj"] Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.297097 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5cd5\" (UniqueName: \"kubernetes.io/projected/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-kube-api-access-k5cd5\") pod \"keystone-6d20-account-create-update-8zrt2\" (UID: \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\") " pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.297166 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-operator-scripts\") pod \"keystone-6d20-account-create-update-8zrt2\" (UID: \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\") " pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.371949 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.372263 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.398333 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-operator-scripts\") pod \"keystone-db-create-q8crj\" (UID: \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\") " pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.398386 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-k5cd5\" (UniqueName: \"kubernetes.io/projected/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-kube-api-access-k5cd5\") pod \"keystone-6d20-account-create-update-8zrt2\" (UID: \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\") " pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.398427 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzb5t\" (UniqueName: \"kubernetes.io/projected/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-kube-api-access-jzb5t\") pod \"keystone-db-create-q8crj\" (UID: \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\") " pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.398558 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-operator-scripts\") pod \"keystone-6d20-account-create-update-8zrt2\" (UID: \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\") " pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.399421 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-operator-scripts\") pod \"keystone-6d20-account-create-update-8zrt2\" (UID: \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\") " pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.417776 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5cd5\" (UniqueName: \"kubernetes.io/projected/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-kube-api-access-k5cd5\") pod \"keystone-6d20-account-create-update-8zrt2\" (UID: \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\") " pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:27 crc 
kubenswrapper[4845]: I0202 10:51:27.457468 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-hchq8"] Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.458832 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hchq8" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.470707 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hchq8"] Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.491265 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.500293 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-operator-scripts\") pod \"placement-db-create-hchq8\" (UID: \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\") " pod="openstack/placement-db-create-hchq8" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.500577 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6k7b\" (UniqueName: \"kubernetes.io/projected/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-kube-api-access-t6k7b\") pod \"placement-db-create-hchq8\" (UID: \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\") " pod="openstack/placement-db-create-hchq8" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.500905 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-operator-scripts\") pod \"keystone-db-create-q8crj\" (UID: \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\") " pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.501025 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jzb5t\" (UniqueName: \"kubernetes.io/projected/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-kube-api-access-jzb5t\") pod \"keystone-db-create-q8crj\" (UID: \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\") " pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.501698 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-operator-scripts\") pod \"keystone-db-create-q8crj\" (UID: \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\") " pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.521938 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzb5t\" (UniqueName: \"kubernetes.io/projected/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-kube-api-access-jzb5t\") pod \"keystone-db-create-q8crj\" (UID: \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\") " pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.537636 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.566215 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-42da-account-create-update-dmqrb"] Feb 02 10:51:27 crc kubenswrapper[4845]: E0202 10:51:27.566632 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db65c5b8-1fc4-43f9-89bd-51ebb710eccd" containerName="init" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.566650 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="db65c5b8-1fc4-43f9-89bd-51ebb710eccd" containerName="init" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.566831 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="db65c5b8-1fc4-43f9-89bd-51ebb710eccd" containerName="init" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.567654 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.570730 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.581867 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-42da-account-create-update-dmqrb"] Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.589568 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.602214 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-config\") pod \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.602290 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-ovsdbserver-nb\") pod \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.602507 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtlrw\" (UniqueName: \"kubernetes.io/projected/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-kube-api-access-jtlrw\") pod \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.602564 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-dns-svc\") pod \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\" (UID: \"db65c5b8-1fc4-43f9-89bd-51ebb710eccd\") " Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.602834 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-operator-scripts\") pod \"placement-db-create-hchq8\" (UID: \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\") " pod="openstack/placement-db-create-hchq8" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.602891 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vz65m\" (UniqueName: \"kubernetes.io/projected/1f4db3a3-fdab-41f0-b675-26aaaa575769-kube-api-access-vz65m\") pod \"placement-42da-account-create-update-dmqrb\" (UID: \"1f4db3a3-fdab-41f0-b675-26aaaa575769\") " pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.602928 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f4db3a3-fdab-41f0-b675-26aaaa575769-operator-scripts\") pod \"placement-42da-account-create-update-dmqrb\" (UID: \"1f4db3a3-fdab-41f0-b675-26aaaa575769\") " pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.602967 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6k7b\" (UniqueName: \"kubernetes.io/projected/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-kube-api-access-t6k7b\") pod \"placement-db-create-hchq8\" (UID: \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\") " pod="openstack/placement-db-create-hchq8" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.603852 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-operator-scripts\") pod \"placement-db-create-hchq8\" (UID: \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\") " pod="openstack/placement-db-create-hchq8" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.611079 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-kube-api-access-jtlrw" (OuterVolumeSpecName: "kube-api-access-jtlrw") pod "db65c5b8-1fc4-43f9-89bd-51ebb710eccd" (UID: "db65c5b8-1fc4-43f9-89bd-51ebb710eccd"). InnerVolumeSpecName "kube-api-access-jtlrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.622657 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6k7b\" (UniqueName: \"kubernetes.io/projected/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-kube-api-access-t6k7b\") pod \"placement-db-create-hchq8\" (UID: \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\") " pod="openstack/placement-db-create-hchq8" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.707677 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz65m\" (UniqueName: \"kubernetes.io/projected/1f4db3a3-fdab-41f0-b675-26aaaa575769-kube-api-access-vz65m\") pod \"placement-42da-account-create-update-dmqrb\" (UID: \"1f4db3a3-fdab-41f0-b675-26aaaa575769\") " pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.707730 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f4db3a3-fdab-41f0-b675-26aaaa575769-operator-scripts\") pod \"placement-42da-account-create-update-dmqrb\" (UID: \"1f4db3a3-fdab-41f0-b675-26aaaa575769\") " pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.707943 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtlrw\" (UniqueName: \"kubernetes.io/projected/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-kube-api-access-jtlrw\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.708851 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f4db3a3-fdab-41f0-b675-26aaaa575769-operator-scripts\") pod \"placement-42da-account-create-update-dmqrb\" (UID: \"1f4db3a3-fdab-41f0-b675-26aaaa575769\") " pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 
10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.711120 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-config" (OuterVolumeSpecName: "config") pod "db65c5b8-1fc4-43f9-89bd-51ebb710eccd" (UID: "db65c5b8-1fc4-43f9-89bd-51ebb710eccd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.711281 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "db65c5b8-1fc4-43f9-89bd-51ebb710eccd" (UID: "db65c5b8-1fc4-43f9-89bd-51ebb710eccd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.735179 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz65m\" (UniqueName: \"kubernetes.io/projected/1f4db3a3-fdab-41f0-b675-26aaaa575769-kube-api-access-vz65m\") pod \"placement-42da-account-create-update-dmqrb\" (UID: \"1f4db3a3-fdab-41f0-b675-26aaaa575769\") " pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.739283 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "db65c5b8-1fc4-43f9-89bd-51ebb710eccd" (UID: "db65c5b8-1fc4-43f9-89bd-51ebb710eccd"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.810141 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.810450 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.810464 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/db65c5b8-1fc4-43f9-89bd-51ebb710eccd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.821729 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hchq8" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.828732 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.962008 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" event={"ID":"db65c5b8-1fc4-43f9-89bd-51ebb710eccd","Type":"ContainerDied","Data":"cd82bf219d534acecb1bc063cfd4353f103c02c9ba7678657bf6ce8f14343b3f"} Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.962080 4845 scope.go:117] "RemoveContainer" containerID="faf3833dd05c13c51f3b521ae7124b1dde7b455aa0d1bcc6b64cb4774ec7cdfd" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.962225 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-l9ddk" Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.987140 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" event={"ID":"47456531-c404-4086-89b2-d159d71fdeb1","Type":"ContainerStarted","Data":"807d2bbccada01aa21068396fdc8f2bcadc102372f4ecccdc24ca17e629e8a36"} Feb 02 10:51:27 crc kubenswrapper[4845]: I0202 10:51:27.987339 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.013467 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" podStartSLOduration=4.013438911 podStartE2EDuration="4.013438911s" podCreationTimestamp="2026-02-02 10:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:28.006923315 +0000 UTC m=+1169.098324765" watchObservedRunningTime="2026-02-02 10:51:28.013438911 +0000 UTC m=+1169.104840381" Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.066938 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l9ddk"] Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.072967 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-l9ddk"] Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.173997 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8crj"] Feb 02 10:51:28 crc kubenswrapper[4845]: W0202 10:51:28.201310 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82c7eb05_f8ef_40a5_b799_af8bfdfd9c4e.slice/crio-73664e3bb52176d824352579e8180b74b675bbf73a6874493edb31b304c82035 WatchSource:0}: Error finding container 
73664e3bb52176d824352579e8180b74b675bbf73a6874493edb31b304c82035: Status 404 returned error can't find the container with id 73664e3bb52176d824352579e8180b74b675bbf73a6874493edb31b304c82035 Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.282455 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6d20-account-create-update-8zrt2"] Feb 02 10:51:28 crc kubenswrapper[4845]: W0202 10:51:28.290547 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod802ba94f_17f1_4eed_93aa_95e5ffe1ea43.slice/crio-55a87cbf22b74d58f38660dbd860ecac7e0253e0080661b0a82322c06749f43b WatchSource:0}: Error finding container 55a87cbf22b74d58f38660dbd860ecac7e0253e0080661b0a82322c06749f43b: Status 404 returned error can't find the container with id 55a87cbf22b74d58f38660dbd860ecac7e0253e0080661b0a82322c06749f43b Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.454854 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-42da-account-create-update-dmqrb"] Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.467149 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-hchq8"] Feb 02 10:51:28 crc kubenswrapper[4845]: W0202 10:51:28.477693 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05069a45_f3d6_43e9_bf29_2e3a3cbcc2d8.slice/crio-9cb9c4d101e2d4087cd50fbee98f35575a3990350ca5373046dc6cec8a9f46fe WatchSource:0}: Error finding container 9cb9c4d101e2d4087cd50fbee98f35575a3990350ca5373046dc6cec8a9f46fe: Status 404 returned error can't find the container with id 9cb9c4d101e2d4087cd50fbee98f35575a3990350ca5373046dc6cec8a9f46fe Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.997298 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"53989098-3602-4958-96b3-ca7c539c29c9","Type":"ContainerStarted","Data":"6ec8de5a3c76a8cb59a7c822a25eb9c56d22cb083abcbfa3c4d66b41f0e8f30e"} Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.997366 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"53989098-3602-4958-96b3-ca7c539c29c9","Type":"ContainerStarted","Data":"e866c831e8a5fb4477a08a67dda617fad70da77f2dc1c392e2e10efc0f5560a9"} Feb 02 10:51:28 crc kubenswrapper[4845]: I0202 10:51:28.999806 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.001871 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hchq8" event={"ID":"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8","Type":"ContainerStarted","Data":"2b3f2d6cbc2fbaafd3e7acd158c2d8862a6fa7d667477c7485ce11aa580584b6"} Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.001920 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hchq8" event={"ID":"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8","Type":"ContainerStarted","Data":"9cb9c4d101e2d4087cd50fbee98f35575a3990350ca5373046dc6cec8a9f46fe"} Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.005864 4845 generic.go:334] "Generic (PLEG): container finished" podID="802ba94f-17f1-4eed-93aa-95e5ffe1ea43" containerID="f34968fe05a8bf94342bdc3d85ae1b9aa88e7cf9dc5bd5dd49c6ff1a1947185f" exitCode=0 Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.006047 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d20-account-create-update-8zrt2" event={"ID":"802ba94f-17f1-4eed-93aa-95e5ffe1ea43","Type":"ContainerDied","Data":"f34968fe05a8bf94342bdc3d85ae1b9aa88e7cf9dc5bd5dd49c6ff1a1947185f"} Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.006149 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d20-account-create-update-8zrt2" 
event={"ID":"802ba94f-17f1-4eed-93aa-95e5ffe1ea43","Type":"ContainerStarted","Data":"55a87cbf22b74d58f38660dbd860ecac7e0253e0080661b0a82322c06749f43b"} Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.011300 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-42da-account-create-update-dmqrb" event={"ID":"1f4db3a3-fdab-41f0-b675-26aaaa575769","Type":"ContainerStarted","Data":"a6fc1e9766e80a7882547aca33a905323116744093fc66e9ca25989843c77b7a"} Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.011360 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-42da-account-create-update-dmqrb" event={"ID":"1f4db3a3-fdab-41f0-b675-26aaaa575769","Type":"ContainerStarted","Data":"91e748b8a84823fc9d68b855d19651a147634d90eb6ab4b1022d6097d05ec54c"} Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.016451 4845 generic.go:334] "Generic (PLEG): container finished" podID="82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e" containerID="4440ff2b707ea7d06429dcadf4f2755be4ff5fe9e3d35fbb9b3d5449440ebcdb" exitCode=0 Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.016786 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8crj" event={"ID":"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e","Type":"ContainerDied","Data":"4440ff2b707ea7d06429dcadf4f2755be4ff5fe9e3d35fbb9b3d5449440ebcdb"} Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.016829 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8crj" event={"ID":"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e","Type":"ContainerStarted","Data":"73664e3bb52176d824352579e8180b74b675bbf73a6874493edb31b304c82035"} Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.034106 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.355527538 podStartE2EDuration="5.034082963s" podCreationTimestamp="2026-02-02 10:51:24 +0000 UTC" 
firstStartedPulling="2026-02-02 10:51:25.938353064 +0000 UTC m=+1167.029754514" lastFinishedPulling="2026-02-02 10:51:27.616908479 +0000 UTC m=+1168.708309939" observedRunningTime="2026-02-02 10:51:29.025366874 +0000 UTC m=+1170.116768324" watchObservedRunningTime="2026-02-02 10:51:29.034082963 +0000 UTC m=+1170.125484413" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.046824 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-42da-account-create-update-dmqrb" podStartSLOduration=2.046804556 podStartE2EDuration="2.046804556s" podCreationTimestamp="2026-02-02 10:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:29.040011772 +0000 UTC m=+1170.131413222" watchObservedRunningTime="2026-02-02 10:51:29.046804556 +0000 UTC m=+1170.138206006" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.059286 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-hchq8" podStartSLOduration=2.059265672 podStartE2EDuration="2.059265672s" podCreationTimestamp="2026-02-02 10:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:29.055852255 +0000 UTC m=+1170.147253705" watchObservedRunningTime="2026-02-02 10:51:29.059265672 +0000 UTC m=+1170.150667122" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.295938 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-wlplx"] Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.303375 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-wlplx" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.313625 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-wlplx"] Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.348826 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9z2\" (UniqueName: \"kubernetes.io/projected/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-kube-api-access-5c9z2\") pod \"mysqld-exporter-openstack-db-create-wlplx\" (UID: \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\") " pod="openstack/mysqld-exporter-openstack-db-create-wlplx" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.348998 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-wlplx\" (UID: \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\") " pod="openstack/mysqld-exporter-openstack-db-create-wlplx" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.386577 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gfr4t"] Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.421147 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-7lh98"] Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.426850 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.451101 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-wlplx\" (UID: \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\") " pod="openstack/mysqld-exporter-openstack-db-create-wlplx" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.451129 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7lh98"] Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.451286 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c9z2\" (UniqueName: \"kubernetes.io/projected/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-kube-api-access-5c9z2\") pod \"mysqld-exporter-openstack-db-create-wlplx\" (UID: \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\") " pod="openstack/mysqld-exporter-openstack-db-create-wlplx" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.452524 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-wlplx\" (UID: \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\") " pod="openstack/mysqld-exporter-openstack-db-create-wlplx" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.499191 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-2fba-account-create-update-57wqb"] Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.500966 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.505156 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.517580 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9z2\" (UniqueName: \"kubernetes.io/projected/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-kube-api-access-5c9z2\") pod \"mysqld-exporter-openstack-db-create-wlplx\" (UID: \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\") " pod="openstack/mysqld-exporter-openstack-db-create-wlplx" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.562662 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-2fba-account-create-update-57wqb"] Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.563715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc354af6-cf06-4532-83c7-845e6f8f41c5-operator-scripts\") pod \"mysqld-exporter-2fba-account-create-update-57wqb\" (UID: \"fc354af6-cf06-4532-83c7-845e6f8f41c5\") " pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.563763 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkvmd\" (UniqueName: \"kubernetes.io/projected/fc354af6-cf06-4532-83c7-845e6f8f41c5-kube-api-access-lkvmd\") pod \"mysqld-exporter-2fba-account-create-update-57wqb\" (UID: \"fc354af6-cf06-4532-83c7-845e6f8f41c5\") " pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.563841 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-config\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.563949 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.563981 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-dns-svc\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.564040 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.564073 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllkd\" (UniqueName: \"kubernetes.io/projected/125bfda8-e971-4249-8b07-0bbff61e4725-kube-api-access-zllkd\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.625598 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-wlplx" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.672060 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllkd\" (UniqueName: \"kubernetes.io/projected/125bfda8-e971-4249-8b07-0bbff61e4725-kube-api-access-zllkd\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.672120 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc354af6-cf06-4532-83c7-845e6f8f41c5-operator-scripts\") pod \"mysqld-exporter-2fba-account-create-update-57wqb\" (UID: \"fc354af6-cf06-4532-83c7-845e6f8f41c5\") " pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.672146 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkvmd\" (UniqueName: \"kubernetes.io/projected/fc354af6-cf06-4532-83c7-845e6f8f41c5-kube-api-access-lkvmd\") pod \"mysqld-exporter-2fba-account-create-update-57wqb\" (UID: \"fc354af6-cf06-4532-83c7-845e6f8f41c5\") " pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.672221 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-config\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.672284 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-nb\") pod 
\"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.672312 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-dns-svc\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.672354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.673230 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.673840 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-config\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.674275 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " 
pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.690139 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc354af6-cf06-4532-83c7-845e6f8f41c5-operator-scripts\") pod \"mysqld-exporter-2fba-account-create-update-57wqb\" (UID: \"fc354af6-cf06-4532-83c7-845e6f8f41c5\") " pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.691426 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-dns-svc\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.723495 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkvmd\" (UniqueName: \"kubernetes.io/projected/fc354af6-cf06-4532-83c7-845e6f8f41c5-kube-api-access-lkvmd\") pod \"mysqld-exporter-2fba-account-create-update-57wqb\" (UID: \"fc354af6-cf06-4532-83c7-845e6f8f41c5\") " pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.728627 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllkd\" (UniqueName: \"kubernetes.io/projected/125bfda8-e971-4249-8b07-0bbff61e4725-kube-api-access-zllkd\") pod \"dnsmasq-dns-698758b865-7lh98\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.762012 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.775301 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db65c5b8-1fc4-43f9-89bd-51ebb710eccd" path="/var/lib/kubelet/pods/db65c5b8-1fc4-43f9-89bd-51ebb710eccd/volumes" Feb 02 10:51:29 crc kubenswrapper[4845]: I0202 10:51:29.900713 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.039088 4845 generic.go:334] "Generic (PLEG): container finished" podID="05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8" containerID="2b3f2d6cbc2fbaafd3e7acd158c2d8862a6fa7d667477c7485ce11aa580584b6" exitCode=0 Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.039168 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hchq8" event={"ID":"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8","Type":"ContainerDied","Data":"2b3f2d6cbc2fbaafd3e7acd158c2d8862a6fa7d667477c7485ce11aa580584b6"} Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.069663 4845 generic.go:334] "Generic (PLEG): container finished" podID="1f4db3a3-fdab-41f0-b675-26aaaa575769" containerID="a6fc1e9766e80a7882547aca33a905323116744093fc66e9ca25989843c77b7a" exitCode=0 Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.070048 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" podUID="47456531-c404-4086-89b2-d159d71fdeb1" containerName="dnsmasq-dns" containerID="cri-o://807d2bbccada01aa21068396fdc8f2bcadc102372f4ecccdc24ca17e629e8a36" gracePeriod=10 Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.070036 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-42da-account-create-update-dmqrb" 
event={"ID":"1f4db3a3-fdab-41f0-b675-26aaaa575769","Type":"ContainerDied","Data":"a6fc1e9766e80a7882547aca33a905323116744093fc66e9ca25989843c77b7a"} Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.151767 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.305580 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.564190 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.571328 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.573577 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.573791 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.579945 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jqmgx" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.580182 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.584064 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.703362 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6db6e42-984a-484b-9f90-e6efa9817f37-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " 
pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.703441 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvwnk\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-kube-api-access-cvwnk\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.703484 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aeab7da0-b130-4af5-8fde-0e9836ac2c44\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeab7da0-b130-4af5-8fde-0e9836ac2c44\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.703568 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6db6e42-984a-484b-9f90-e6efa9817f37-lock\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.703605 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d6db6e42-984a-484b-9f90-e6efa9817f37-cache\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.703809 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 
10:51:30.807402 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvwnk\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-kube-api-access-cvwnk\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.807672 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aeab7da0-b130-4af5-8fde-0e9836ac2c44\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeab7da0-b130-4af5-8fde-0e9836ac2c44\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.807874 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6db6e42-984a-484b-9f90-e6efa9817f37-lock\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.807956 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d6db6e42-984a-484b-9f90-e6efa9817f37-cache\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.808042 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.808083 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d6db6e42-984a-484b-9f90-e6efa9817f37-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: E0202 10:51:30.811429 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:51:30 crc kubenswrapper[4845]: E0202 10:51:30.811458 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:51:30 crc kubenswrapper[4845]: E0202 10:51:30.811506 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift podName:d6db6e42-984a-484b-9f90-e6efa9817f37 nodeName:}" failed. No retries permitted until 2026-02-02 10:51:31.311485491 +0000 UTC m=+1172.402886991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift") pod "swift-storage-0" (UID: "d6db6e42-984a-484b-9f90-e6efa9817f37") : configmap "swift-ring-files" not found Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.811692 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d6db6e42-984a-484b-9f90-e6efa9817f37-cache\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.811714 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d6db6e42-984a-484b-9f90-e6efa9817f37-lock\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.816614 4845 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.816653 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aeab7da0-b130-4af5-8fde-0e9836ac2c44\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeab7da0-b130-4af5-8fde-0e9836ac2c44\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/889f87c871906604b344aa5a2d9d655cfec3434c73aede0f647c6d1c9bfbfe68/globalmount\"" pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.816992 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6db6e42-984a-484b-9f90-e6efa9817f37-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.826735 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvwnk\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-kube-api-access-cvwnk\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.827871 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-wlplx"] Feb 02 10:51:30 crc kubenswrapper[4845]: W0202 10:51:30.828454 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d4f7cb3_0991_4ce6_a69d_fd6f17bbc2fc.slice/crio-2a68ae9cc8e3b067fc0a7eef64277e6ebec856643c581c42297e123e19818a08 WatchSource:0}: Error finding container 2a68ae9cc8e3b067fc0a7eef64277e6ebec856643c581c42297e123e19818a08: Status 404 
returned error can't find the container with id 2a68ae9cc8e3b067fc0a7eef64277e6ebec856643c581c42297e123e19818a08 Feb 02 10:51:30 crc kubenswrapper[4845]: I0202 10:51:30.877299 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aeab7da0-b130-4af5-8fde-0e9836ac2c44\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeab7da0-b130-4af5-8fde-0e9836ac2c44\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.089036 4845 generic.go:334] "Generic (PLEG): container finished" podID="47456531-c404-4086-89b2-d159d71fdeb1" containerID="807d2bbccada01aa21068396fdc8f2bcadc102372f4ecccdc24ca17e629e8a36" exitCode=0 Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.089735 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" event={"ID":"47456531-c404-4086-89b2-d159d71fdeb1","Type":"ContainerDied","Data":"807d2bbccada01aa21068396fdc8f2bcadc102372f4ecccdc24ca17e629e8a36"} Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.092186 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6d20-account-create-update-8zrt2" event={"ID":"802ba94f-17f1-4eed-93aa-95e5ffe1ea43","Type":"ContainerDied","Data":"55a87cbf22b74d58f38660dbd860ecac7e0253e0080661b0a82322c06749f43b"} Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.092226 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55a87cbf22b74d58f38660dbd860ecac7e0253e0080661b0a82322c06749f43b" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.104972 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-wlplx" event={"ID":"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc","Type":"ContainerStarted","Data":"2a68ae9cc8e3b067fc0a7eef64277e6ebec856643c581c42297e123e19818a08"} Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 
10:51:31.108721 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7lh98"] Feb 02 10:51:31 crc kubenswrapper[4845]: W0202 10:51:31.118002 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc354af6_cf06_4532_83c7_845e6f8f41c5.slice/crio-2a5d308418e3df9d85a25d801311a5f584b57fecacd8d2dc49cfb886adff6b11 WatchSource:0}: Error finding container 2a5d308418e3df9d85a25d801311a5f584b57fecacd8d2dc49cfb886adff6b11: Status 404 returned error can't find the container with id 2a5d308418e3df9d85a25d801311a5f584b57fecacd8d2dc49cfb886adff6b11 Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.121951 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-2fba-account-create-update-57wqb"] Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.122189 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8crj" event={"ID":"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e","Type":"ContainerDied","Data":"73664e3bb52176d824352579e8180b74b675bbf73a6874493edb31b304c82035"} Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.122226 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73664e3bb52176d824352579e8180b74b675bbf73a6874493edb31b304c82035" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.169064 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.181795 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.340353 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzb5t\" (UniqueName: \"kubernetes.io/projected/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-kube-api-access-jzb5t\") pod \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\" (UID: \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\") " Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.340451 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5cd5\" (UniqueName: \"kubernetes.io/projected/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-kube-api-access-k5cd5\") pod \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\" (UID: \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\") " Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.340470 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-operator-scripts\") pod \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\" (UID: \"802ba94f-17f1-4eed-93aa-95e5ffe1ea43\") " Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.340545 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-operator-scripts\") pod \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\" (UID: \"82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e\") " Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.340980 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:31 crc kubenswrapper[4845]: E0202 10:51:31.341229 4845 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:51:31 crc kubenswrapper[4845]: E0202 10:51:31.341256 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:51:31 crc kubenswrapper[4845]: E0202 10:51:31.341307 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift podName:d6db6e42-984a-484b-9f90-e6efa9817f37 nodeName:}" failed. No retries permitted until 2026-02-02 10:51:32.341288799 +0000 UTC m=+1173.432690249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift") pod "swift-storage-0" (UID: "d6db6e42-984a-484b-9f90-e6efa9817f37") : configmap "swift-ring-files" not found Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.343747 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "802ba94f-17f1-4eed-93aa-95e5ffe1ea43" (UID: "802ba94f-17f1-4eed-93aa-95e5ffe1ea43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.344000 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e" (UID: "82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.364571 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-kube-api-access-jzb5t" (OuterVolumeSpecName: "kube-api-access-jzb5t") pod "82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e" (UID: "82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e"). InnerVolumeSpecName "kube-api-access-jzb5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.371731 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-kube-api-access-k5cd5" (OuterVolumeSpecName: "kube-api-access-k5cd5") pod "802ba94f-17f1-4eed-93aa-95e5ffe1ea43" (UID: "802ba94f-17f1-4eed-93aa-95e5ffe1ea43"). InnerVolumeSpecName "kube-api-access-k5cd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.443467 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzb5t\" (UniqueName: \"kubernetes.io/projected/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-kube-api-access-jzb5t\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.443505 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.443519 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5cd5\" (UniqueName: \"kubernetes.io/projected/802ba94f-17f1-4eed-93aa-95e5ffe1ea43-kube-api-access-k5cd5\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:31 crc kubenswrapper[4845]: I0202 10:51:31.443530 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.046201 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hchq8" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.065636 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.091796 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.138049 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" event={"ID":"47456531-c404-4086-89b2-d159d71fdeb1","Type":"ContainerDied","Data":"51a381a3e9440b9cc201282058f7396573a6b9b5751fc506fe374daac0e1764f"} Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.138081 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-gfr4t" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.138125 4845 scope.go:117] "RemoveContainer" containerID="807d2bbccada01aa21068396fdc8f2bcadc102372f4ecccdc24ca17e629e8a36" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.143384 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-hchq8" event={"ID":"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8","Type":"ContainerDied","Data":"9cb9c4d101e2d4087cd50fbee98f35575a3990350ca5373046dc6cec8a9f46fe"} Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.143421 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb9c4d101e2d4087cd50fbee98f35575a3990350ca5373046dc6cec8a9f46fe" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.143478 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-hchq8" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.158665 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-42da-account-create-update-dmqrb" event={"ID":"1f4db3a3-fdab-41f0-b675-26aaaa575769","Type":"ContainerDied","Data":"91e748b8a84823fc9d68b855d19651a147634d90eb6ab4b1022d6097d05ec54c"} Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.158704 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91e748b8a84823fc9d68b855d19651a147634d90eb6ab4b1022d6097d05ec54c" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.158797 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-42da-account-create-update-dmqrb" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.160547 4845 generic.go:334] "Generic (PLEG): container finished" podID="8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc" containerID="2206de97b65437900f5876967e6088126d1863f30884da3104dd45175b5b4a13" exitCode=0 Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.160593 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-wlplx" event={"ID":"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc","Type":"ContainerDied","Data":"2206de97b65437900f5876967e6088126d1863f30884da3104dd45175b5b4a13"} Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.162555 4845 generic.go:334] "Generic (PLEG): container finished" podID="fc354af6-cf06-4532-83c7-845e6f8f41c5" containerID="a93a594f3ef1f5fe22327995e92493ed98d1dba81262b5bcf617c4c84d0e3aba" exitCode=0 Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.162612 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" event={"ID":"fc354af6-cf06-4532-83c7-845e6f8f41c5","Type":"ContainerDied","Data":"a93a594f3ef1f5fe22327995e92493ed98d1dba81262b5bcf617c4c84d0e3aba"} Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.162630 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" event={"ID":"fc354af6-cf06-4532-83c7-845e6f8f41c5","Type":"ContainerStarted","Data":"2a5d308418e3df9d85a25d801311a5f584b57fecacd8d2dc49cfb886adff6b11"} Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.163779 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6k7b\" (UniqueName: \"kubernetes.io/projected/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-kube-api-access-t6k7b\") pod \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\" (UID: \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\") " Feb 02 10:51:32 crc kubenswrapper[4845]: 
I0202 10:51:32.163870 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-operator-scripts\") pod \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\" (UID: \"05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8\") " Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.165343 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8" (UID: "05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.170150 4845 generic.go:334] "Generic (PLEG): container finished" podID="125bfda8-e971-4249-8b07-0bbff61e4725" containerID="ddd6909bbdf16b8af6fed8c71335e6bc1892bf152856011ceb433a2a497011e2" exitCode=0 Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.170233 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8crj" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.170571 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7lh98" event={"ID":"125bfda8-e971-4249-8b07-0bbff61e4725","Type":"ContainerDied","Data":"ddd6909bbdf16b8af6fed8c71335e6bc1892bf152856011ceb433a2a497011e2"} Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.170618 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7lh98" event={"ID":"125bfda8-e971-4249-8b07-0bbff61e4725","Type":"ContainerStarted","Data":"d8710db5f1971bcb1ada6e2682b3528a8c529ad636b2e603fac42dddaaffa6b0"} Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.170668 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6d20-account-create-update-8zrt2" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.173196 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-kube-api-access-t6k7b" (OuterVolumeSpecName: "kube-api-access-t6k7b") pod "05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8" (UID: "05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8"). InnerVolumeSpecName "kube-api-access-t6k7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.269523 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-config\") pod \"47456531-c404-4086-89b2-d159d71fdeb1\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.269633 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-297q5\" (UniqueName: \"kubernetes.io/projected/47456531-c404-4086-89b2-d159d71fdeb1-kube-api-access-297q5\") pod \"47456531-c404-4086-89b2-d159d71fdeb1\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.269667 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-dns-svc\") pod \"47456531-c404-4086-89b2-d159d71fdeb1\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.269711 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz65m\" (UniqueName: \"kubernetes.io/projected/1f4db3a3-fdab-41f0-b675-26aaaa575769-kube-api-access-vz65m\") pod \"1f4db3a3-fdab-41f0-b675-26aaaa575769\" (UID: \"1f4db3a3-fdab-41f0-b675-26aaaa575769\") " Feb 02 10:51:32 crc 
kubenswrapper[4845]: I0202 10:51:32.269828 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f4db3a3-fdab-41f0-b675-26aaaa575769-operator-scripts\") pod \"1f4db3a3-fdab-41f0-b675-26aaaa575769\" (UID: \"1f4db3a3-fdab-41f0-b675-26aaaa575769\") " Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.269854 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-nb\") pod \"47456531-c404-4086-89b2-d159d71fdeb1\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.270505 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-sb\") pod \"47456531-c404-4086-89b2-d159d71fdeb1\" (UID: \"47456531-c404-4086-89b2-d159d71fdeb1\") " Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.271151 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.271173 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6k7b\" (UniqueName: \"kubernetes.io/projected/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8-kube-api-access-t6k7b\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.272546 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f4db3a3-fdab-41f0-b675-26aaaa575769-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f4db3a3-fdab-41f0-b675-26aaaa575769" (UID: "1f4db3a3-fdab-41f0-b675-26aaaa575769"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.276773 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f4db3a3-fdab-41f0-b675-26aaaa575769-kube-api-access-vz65m" (OuterVolumeSpecName: "kube-api-access-vz65m") pod "1f4db3a3-fdab-41f0-b675-26aaaa575769" (UID: "1f4db3a3-fdab-41f0-b675-26aaaa575769"). InnerVolumeSpecName "kube-api-access-vz65m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.282297 4845 scope.go:117] "RemoveContainer" containerID="30b7a3b349a6bebefe667d17eaae2b3cbdc973ed20f342ad0701818a0d0cb4b7" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.295132 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47456531-c404-4086-89b2-d159d71fdeb1-kube-api-access-297q5" (OuterVolumeSpecName: "kube-api-access-297q5") pod "47456531-c404-4086-89b2-d159d71fdeb1" (UID: "47456531-c404-4086-89b2-d159d71fdeb1"). InnerVolumeSpecName "kube-api-access-297q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.416218 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.416276 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.416373 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift podName:d6db6e42-984a-484b-9f90-e6efa9817f37 nodeName:}" failed. No retries permitted until 2026-02-02 10:51:34.416339664 +0000 UTC m=+1175.507741124 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift") pod "swift-storage-0" (UID: "d6db6e42-984a-484b-9f90-e6efa9817f37") : configmap "swift-ring-files" not found Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.412094 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.447335 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-297q5\" (UniqueName: \"kubernetes.io/projected/47456531-c404-4086-89b2-d159d71fdeb1-kube-api-access-297q5\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.447377 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz65m\" (UniqueName: \"kubernetes.io/projected/1f4db3a3-fdab-41f0-b675-26aaaa575769-kube-api-access-vz65m\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.447406 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f4db3a3-fdab-41f0-b675-26aaaa575769-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.455399 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "47456531-c404-4086-89b2-d159d71fdeb1" (UID: "47456531-c404-4086-89b2-d159d71fdeb1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.500427 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "47456531-c404-4086-89b2-d159d71fdeb1" (UID: "47456531-c404-4086-89b2-d159d71fdeb1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.501005 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-config" (OuterVolumeSpecName: "config") pod "47456531-c404-4086-89b2-d159d71fdeb1" (UID: "47456531-c404-4086-89b2-d159d71fdeb1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.540540 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "47456531-c404-4086-89b2-d159d71fdeb1" (UID: "47456531-c404-4086-89b2-d159d71fdeb1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.553812 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.553846 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.553872 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.553893 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/47456531-c404-4086-89b2-d159d71fdeb1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.725154 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4gj6v"] Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.725952 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8" containerName="mariadb-database-create" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.725975 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8" containerName="mariadb-database-create" Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.726002 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="802ba94f-17f1-4eed-93aa-95e5ffe1ea43" containerName="mariadb-account-create-update" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726011 4845 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="802ba94f-17f1-4eed-93aa-95e5ffe1ea43" containerName="mariadb-account-create-update" Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.726036 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47456531-c404-4086-89b2-d159d71fdeb1" containerName="init" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726045 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="47456531-c404-4086-89b2-d159d71fdeb1" containerName="init" Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.726060 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47456531-c404-4086-89b2-d159d71fdeb1" containerName="dnsmasq-dns" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726071 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="47456531-c404-4086-89b2-d159d71fdeb1" containerName="dnsmasq-dns" Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.726086 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e" containerName="mariadb-database-create" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726095 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e" containerName="mariadb-database-create" Feb 02 10:51:32 crc kubenswrapper[4845]: E0202 10:51:32.726116 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f4db3a3-fdab-41f0-b675-26aaaa575769" containerName="mariadb-account-create-update" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726125 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f4db3a3-fdab-41f0-b675-26aaaa575769" containerName="mariadb-account-create-update" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726396 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8" containerName="mariadb-database-create" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726423 4845 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1f4db3a3-fdab-41f0-b675-26aaaa575769" containerName="mariadb-account-create-update" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726439 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="47456531-c404-4086-89b2-d159d71fdeb1" containerName="dnsmasq-dns" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726476 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e" containerName="mariadb-database-create" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.726493 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="802ba94f-17f1-4eed-93aa-95e5ffe1ea43" containerName="mariadb-account-create-update" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.727419 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.741819 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4gj6v"] Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.788443 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1e2c-account-create-update-jxgpc"] Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.790128 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.792758 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.802584 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1e2c-account-create-update-jxgpc"] Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.850115 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gfr4t"] Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.863148 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rwsf\" (UniqueName: \"kubernetes.io/projected/13911fd9-043e-424e-ba84-da6af616a202-kube-api-access-7rwsf\") pod \"glance-1e2c-account-create-update-jxgpc\" (UID: \"13911fd9-043e-424e-ba84-da6af616a202\") " pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.863335 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvxqp\" (UniqueName: \"kubernetes.io/projected/aa851884-d67b-4c70-8ad6-9dcf92001aa5-kube-api-access-zvxqp\") pod \"glance-db-create-4gj6v\" (UID: \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\") " pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.863592 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa851884-d67b-4c70-8ad6-9dcf92001aa5-operator-scripts\") pod \"glance-db-create-4gj6v\" (UID: \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\") " pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.863643 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13911fd9-043e-424e-ba84-da6af616a202-operator-scripts\") pod \"glance-1e2c-account-create-update-jxgpc\" (UID: \"13911fd9-043e-424e-ba84-da6af616a202\") " pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.871803 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-gfr4t"] Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.965707 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa851884-d67b-4c70-8ad6-9dcf92001aa5-operator-scripts\") pod \"glance-db-create-4gj6v\" (UID: \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\") " pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.965797 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13911fd9-043e-424e-ba84-da6af616a202-operator-scripts\") pod \"glance-1e2c-account-create-update-jxgpc\" (UID: \"13911fd9-043e-424e-ba84-da6af616a202\") " pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.965876 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rwsf\" (UniqueName: \"kubernetes.io/projected/13911fd9-043e-424e-ba84-da6af616a202-kube-api-access-7rwsf\") pod \"glance-1e2c-account-create-update-jxgpc\" (UID: \"13911fd9-043e-424e-ba84-da6af616a202\") " pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.965980 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvxqp\" (UniqueName: \"kubernetes.io/projected/aa851884-d67b-4c70-8ad6-9dcf92001aa5-kube-api-access-zvxqp\") pod \"glance-db-create-4gj6v\" (UID: 
\"aa851884-d67b-4c70-8ad6-9dcf92001aa5\") " pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.966612 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13911fd9-043e-424e-ba84-da6af616a202-operator-scripts\") pod \"glance-1e2c-account-create-update-jxgpc\" (UID: \"13911fd9-043e-424e-ba84-da6af616a202\") " pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.967242 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa851884-d67b-4c70-8ad6-9dcf92001aa5-operator-scripts\") pod \"glance-db-create-4gj6v\" (UID: \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\") " pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.983824 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvxqp\" (UniqueName: \"kubernetes.io/projected/aa851884-d67b-4c70-8ad6-9dcf92001aa5-kube-api-access-zvxqp\") pod \"glance-db-create-4gj6v\" (UID: \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\") " pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:32 crc kubenswrapper[4845]: I0202 10:51:32.986583 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rwsf\" (UniqueName: \"kubernetes.io/projected/13911fd9-043e-424e-ba84-da6af616a202-kube-api-access-7rwsf\") pod \"glance-1e2c-account-create-update-jxgpc\" (UID: \"13911fd9-043e-424e-ba84-da6af616a202\") " pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.110359 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.121792 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.188664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7lh98" event={"ID":"125bfda8-e971-4249-8b07-0bbff61e4725","Type":"ContainerStarted","Data":"7b934d9dc539ae78cc75a18c8a04ead9cef0ee49e2411933655a99884f524af5"} Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.189916 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.690123 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.710476 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-7lh98" podStartSLOduration=4.710456273 podStartE2EDuration="4.710456273s" podCreationTimestamp="2026-02-02 10:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:33.214100851 +0000 UTC m=+1174.305502301" watchObservedRunningTime="2026-02-02 10:51:33.710456273 +0000 UTC m=+1174.801857733" Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.733484 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47456531-c404-4086-89b2-d159d71fdeb1" path="/var/lib/kubelet/pods/47456531-c404-4086-89b2-d159d71fdeb1/volumes" Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.785313 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc354af6-cf06-4532-83c7-845e6f8f41c5-operator-scripts\") pod \"fc354af6-cf06-4532-83c7-845e6f8f41c5\" (UID: \"fc354af6-cf06-4532-83c7-845e6f8f41c5\") " Feb 02 10:51:33 crc 
kubenswrapper[4845]: I0202 10:51:33.785434 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkvmd\" (UniqueName: \"kubernetes.io/projected/fc354af6-cf06-4532-83c7-845e6f8f41c5-kube-api-access-lkvmd\") pod \"fc354af6-cf06-4532-83c7-845e6f8f41c5\" (UID: \"fc354af6-cf06-4532-83c7-845e6f8f41c5\") "
Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.787190 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc354af6-cf06-4532-83c7-845e6f8f41c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc354af6-cf06-4532-83c7-845e6f8f41c5" (UID: "fc354af6-cf06-4532-83c7-845e6f8f41c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.793112 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc354af6-cf06-4532-83c7-845e6f8f41c5-kube-api-access-lkvmd" (OuterVolumeSpecName: "kube-api-access-lkvmd") pod "fc354af6-cf06-4532-83c7-845e6f8f41c5" (UID: "fc354af6-cf06-4532-83c7-845e6f8f41c5"). InnerVolumeSpecName "kube-api-access-lkvmd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.828090 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4gj6v"]
Feb 02 10:51:33 crc kubenswrapper[4845]: W0202 10:51:33.833387 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa851884_d67b_4c70_8ad6_9dcf92001aa5.slice/crio-7585d99dc88ad40cca31f5ecdb033d0e22df31000a8f8f1c08b3e9e8a1e7d5d0 WatchSource:0}: Error finding container 7585d99dc88ad40cca31f5ecdb033d0e22df31000a8f8f1c08b3e9e8a1e7d5d0: Status 404 returned error can't find the container with id 7585d99dc88ad40cca31f5ecdb033d0e22df31000a8f8f1c08b3e9e8a1e7d5d0
Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.888375 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc354af6-cf06-4532-83c7-845e6f8f41c5-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.888418 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkvmd\" (UniqueName: \"kubernetes.io/projected/fc354af6-cf06-4532-83c7-845e6f8f41c5-kube-api-access-lkvmd\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:33 crc kubenswrapper[4845]: I0202 10:51:33.985847 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1e2c-account-create-update-jxgpc"]
Feb 02 10:51:33 crc kubenswrapper[4845]: W0202 10:51:33.993498 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13911fd9_043e_424e_ba84_da6af616a202.slice/crio-9493c3898fec2faa6595d94b43b866cde5372c5130166dddd2a82a77a44369f5 WatchSource:0}: Error finding container 9493c3898fec2faa6595d94b43b866cde5372c5130166dddd2a82a77a44369f5: Status 404 returned error can't find the container with id 9493c3898fec2faa6595d94b43b866cde5372c5130166dddd2a82a77a44369f5
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.014764 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-wlplx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.091554 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-operator-scripts\") pod \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\" (UID: \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\") "
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.091998 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c9z2\" (UniqueName: \"kubernetes.io/projected/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-kube-api-access-5c9z2\") pod \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\" (UID: \"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc\") "
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.092421 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc" (UID: "8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.093051 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.101913 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-kube-api-access-5c9z2" (OuterVolumeSpecName: "kube-api-access-5c9z2") pod "8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc" (UID: "8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc"). InnerVolumeSpecName "kube-api-access-5c9z2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.195573 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c9z2\" (UniqueName: \"kubernetes.io/projected/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc-kube-api-access-5c9z2\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.214744 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1e2c-account-create-update-jxgpc" event={"ID":"13911fd9-043e-424e-ba84-da6af616a202","Type":"ContainerStarted","Data":"3459391ca6853a62f9d027449b7b8a2cf779254301205a614a5d854038e87879"}
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.214795 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1e2c-account-create-update-jxgpc" event={"ID":"13911fd9-043e-424e-ba84-da6af616a202","Type":"ContainerStarted","Data":"9493c3898fec2faa6595d94b43b866cde5372c5130166dddd2a82a77a44369f5"}
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.217280 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-wlplx" event={"ID":"8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc","Type":"ContainerDied","Data":"2a68ae9cc8e3b067fc0a7eef64277e6ebec856643c581c42297e123e19818a08"}
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.217500 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a68ae9cc8e3b067fc0a7eef64277e6ebec856643c581c42297e123e19818a08"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.217294 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-wlplx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.232524 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb" event={"ID":"fc354af6-cf06-4532-83c7-845e6f8f41c5","Type":"ContainerDied","Data":"2a5d308418e3df9d85a25d801311a5f584b57fecacd8d2dc49cfb886adff6b11"}
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.232592 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5d308418e3df9d85a25d801311a5f584b57fecacd8d2dc49cfb886adff6b11"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.232713 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-2fba-account-create-update-57wqb"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.241752 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4gj6v" event={"ID":"aa851884-d67b-4c70-8ad6-9dcf92001aa5","Type":"ContainerStarted","Data":"c7c52a689d78c280fc8b76cead8491d0d675cc8f51ac92d599f8ca929b26e719"}
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.241827 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4gj6v" event={"ID":"aa851884-d67b-4c70-8ad6-9dcf92001aa5","Type":"ContainerStarted","Data":"7585d99dc88ad40cca31f5ecdb033d0e22df31000a8f8f1c08b3e9e8a1e7d5d0"}
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.265099 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-1e2c-account-create-update-jxgpc" podStartSLOduration=2.265079389 podStartE2EDuration="2.265079389s" podCreationTimestamp="2026-02-02 10:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:34.229276287 +0000 UTC m=+1175.320677737" watchObservedRunningTime="2026-02-02 10:51:34.265079389 +0000 UTC m=+1175.356480839"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.285077 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-4gj6v" podStartSLOduration=2.285056079 podStartE2EDuration="2.285056079s" podCreationTimestamp="2026-02-02 10:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:34.279291905 +0000 UTC m=+1175.370693355" watchObservedRunningTime="2026-02-02 10:51:34.285056079 +0000 UTC m=+1175.376457529"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.382518 4845 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openstack/root-account-create-update-bbgdx"]
Feb 02 10:51:34 crc kubenswrapper[4845]: E0202 10:51:34.383050 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc" containerName="mariadb-database-create"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.383075 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc" containerName="mariadb-database-create"
Feb 02 10:51:34 crc kubenswrapper[4845]: E0202 10:51:34.383094 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc354af6-cf06-4532-83c7-845e6f8f41c5" containerName="mariadb-account-create-update"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.383104 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc354af6-cf06-4532-83c7-845e6f8f41c5" containerName="mariadb-account-create-update"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.383386 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc354af6-cf06-4532-83c7-845e6f8f41c5" containerName="mariadb-account-create-update"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.383414 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc" containerName="mariadb-database-create"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.384775 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bbgdx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.386314 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.401972 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bbgdx"]
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.454759 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gln2q"]
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.456100 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.459175 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.459354 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.459478 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507549 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-swiftconf\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507618 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvsgj\" (UniqueName: \"kubernetes.io/projected/353b5053-2393-4c95-9800-fc96032fe017-kube-api-access-dvsgj\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507646 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqv8p\" (UniqueName: \"kubernetes.io/projected/340eb818-166a-42f4-a562-8ffa18018118-kube-api-access-mqv8p\") pod \"root-account-create-update-bbgdx\" (UID: \"340eb818-166a-42f4-a562-8ffa18018118\") " pod="openstack/root-account-create-update-bbgdx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507678 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507722 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/353b5053-2393-4c95-9800-fc96032fe017-etc-swift\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507737 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-dispersionconf\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507770 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-combined-ca-bundle\") pod \"swift-ring-rebalance-gln2q\" (UID:
\"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507830 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/340eb818-166a-42f4-a562-8ffa18018118-operator-scripts\") pod \"root-account-create-update-bbgdx\" (UID: \"340eb818-166a-42f4-a562-8ffa18018118\") " pod="openstack/root-account-create-update-bbgdx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507857 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-scripts\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.507874 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-ring-data-devices\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: E0202 10:51:34.508053 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 02 10:51:34 crc kubenswrapper[4845]: E0202 10:51:34.508068 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 02 10:51:34 crc kubenswrapper[4845]: E0202 10:51:34.508133 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift podName:d6db6e42-984a-484b-9f90-e6efa9817f37 nodeName:}" failed. No retries permitted until 2026-02-02 10:51:38.508116748 +0000 UTC m=+1179.599518198 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift") pod "swift-storage-0" (UID: "d6db6e42-984a-484b-9f90-e6efa9817f37") : configmap "swift-ring-files" not found
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.508379 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gln2q"]
Feb 02 10:51:34 crc kubenswrapper[4845]: E0202 10:51:34.517408 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-dvsgj ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-gln2q" podUID="353b5053-2393-4c95-9800-fc96032fe017"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.555010 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-fwkp8"]
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.562675 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.575484 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gln2q"]
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.586649 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fwkp8"]
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.609777 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/340eb818-166a-42f4-a562-8ffa18018118-operator-scripts\") pod \"root-account-create-update-bbgdx\" (UID: \"340eb818-166a-42f4-a562-8ffa18018118\") " pod="openstack/root-account-create-update-bbgdx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.609847 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-scripts\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.609873 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-ring-data-devices\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.609931 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-swiftconf\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.609993 4845
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvsgj\" (UniqueName: \"kubernetes.io/projected/353b5053-2393-4c95-9800-fc96032fe017-kube-api-access-dvsgj\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.610018 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqv8p\" (UniqueName: \"kubernetes.io/projected/340eb818-166a-42f4-a562-8ffa18018118-kube-api-access-mqv8p\") pod \"root-account-create-update-bbgdx\" (UID: \"340eb818-166a-42f4-a562-8ffa18018118\") " pod="openstack/root-account-create-update-bbgdx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.610072 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/353b5053-2393-4c95-9800-fc96032fe017-etc-swift\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.610086 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-dispersionconf\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.610123 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-combined-ca-bundle\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.612348 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-scripts\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.612356 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/353b5053-2393-4c95-9800-fc96032fe017-etc-swift\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.612734 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-ring-data-devices\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.613051 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/340eb818-166a-42f4-a562-8ffa18018118-operator-scripts\") pod \"root-account-create-update-bbgdx\" (UID: \"340eb818-166a-42f4-a562-8ffa18018118\") " pod="openstack/root-account-create-update-bbgdx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.615582 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-combined-ca-bundle\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.615615 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-dispersionconf\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.615770 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-swiftconf\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.631296 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvsgj\" (UniqueName: \"kubernetes.io/projected/353b5053-2393-4c95-9800-fc96032fe017-kube-api-access-dvsgj\") pod \"swift-ring-rebalance-gln2q\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") " pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.634719 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqv8p\" (UniqueName: \"kubernetes.io/projected/340eb818-166a-42f4-a562-8ffa18018118-kube-api-access-mqv8p\") pod \"root-account-create-update-bbgdx\" (UID: \"340eb818-166a-42f4-a562-8ffa18018118\") " pod="openstack/root-account-create-update-bbgdx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.711826 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-dispersionconf\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.711945 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName:
\"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-swiftconf\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.711994 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v29f4\" (UniqueName: \"kubernetes.io/projected/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-kube-api-access-v29f4\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.712111 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-scripts\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.712166 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-combined-ca-bundle\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.712236 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-etc-swift\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.712310 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-ring-data-devices\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.777392 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bbgdx"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.814715 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-scripts\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.815587 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-scripts\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.815821 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-combined-ca-bundle\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.816168 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-etc-swift\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.815878 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-etc-swift\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.816289 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-ring-data-devices\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.816843 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-ring-data-devices\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.817163 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-dispersionconf\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.817272 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-swiftconf\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.817318 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v29f4\" (UniqueName: \"kubernetes.io/projected/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-kube-api-access-v29f4\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.822827 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-swiftconf\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.822979 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-combined-ca-bundle\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.827923 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-dispersionconf\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:34 crc kubenswrapper[4845]: I0202 10:51:34.836721 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v29f4\" (UniqueName: \"kubernetes.io/projected/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-kube-api-access-v29f4\") pod \"swift-ring-rebalance-fwkp8\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") " pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.061283 4845 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.278970 4845 generic.go:334] "Generic (PLEG): container finished" podID="aa851884-d67b-4c70-8ad6-9dcf92001aa5" containerID="c7c52a689d78c280fc8b76cead8491d0d675cc8f51ac92d599f8ca929b26e719" exitCode=0
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.279774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4gj6v" event={"ID":"aa851884-d67b-4c70-8ad6-9dcf92001aa5","Type":"ContainerDied","Data":"c7c52a689d78c280fc8b76cead8491d0d675cc8f51ac92d599f8ca929b26e719"}
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.289595 4845 generic.go:334] "Generic (PLEG): container finished" podID="13911fd9-043e-424e-ba84-da6af616a202" containerID="3459391ca6853a62f9d027449b7b8a2cf779254301205a614a5d854038e87879" exitCode=0
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.289700 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.290522 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1e2c-account-create-update-jxgpc" event={"ID":"13911fd9-043e-424e-ba84-da6af616a202","Type":"ContainerDied","Data":"3459391ca6853a62f9d027449b7b8a2cf779254301205a614a5d854038e87879"}
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.302116 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gln2q"
Feb 02 10:51:35 crc kubenswrapper[4845]: W0202 10:51:35.374893 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod340eb818_166a_42f4_a562_8ffa18018118.slice/crio-3c2857e7e1d4b7826e96833417a57da6a98eed0c53437fe71531e7ded412b853 WatchSource:0}: Error finding container 3c2857e7e1d4b7826e96833417a57da6a98eed0c53437fe71531e7ded412b853: Status 404 returned error can't find the container with id 3c2857e7e1d4b7826e96833417a57da6a98eed0c53437fe71531e7ded412b853
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.380731 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bbgdx"]
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.430323 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-ring-data-devices\") pod \"353b5053-2393-4c95-9800-fc96032fe017\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") "
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.430405 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-combined-ca-bundle\") pod \"353b5053-2393-4c95-9800-fc96032fe017\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") "
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.430440 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-scripts\") pod \"353b5053-2393-4c95-9800-fc96032fe017\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") "
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.430473 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-dispersionconf\") pod \"353b5053-2393-4c95-9800-fc96032fe017\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") "
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.430522 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-swiftconf\") pod \"353b5053-2393-4c95-9800-fc96032fe017\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") "
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.430660 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/353b5053-2393-4c95-9800-fc96032fe017-etc-swift\") pod \"353b5053-2393-4c95-9800-fc96032fe017\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") "
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.430722 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvsgj\" (UniqueName: \"kubernetes.io/projected/353b5053-2393-4c95-9800-fc96032fe017-kube-api-access-dvsgj\") pod \"353b5053-2393-4c95-9800-fc96032fe017\" (UID: \"353b5053-2393-4c95-9800-fc96032fe017\") "
Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.430836 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "353b5053-2393-4c95-9800-fc96032fe017" (UID: "353b5053-2393-4c95-9800-fc96032fe017"). InnerVolumeSpecName "ring-data-devices".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.431425 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-scripts" (OuterVolumeSpecName: "scripts") pod "353b5053-2393-4c95-9800-fc96032fe017" (UID: "353b5053-2393-4c95-9800-fc96032fe017"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.431764 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/353b5053-2393-4c95-9800-fc96032fe017-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "353b5053-2393-4c95-9800-fc96032fe017" (UID: "353b5053-2393-4c95-9800-fc96032fe017"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.432866 4845 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.432906 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/353b5053-2393-4c95-9800-fc96032fe017-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.432918 4845 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/353b5053-2393-4c95-9800-fc96032fe017-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.437012 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "353b5053-2393-4c95-9800-fc96032fe017" (UID: 
"353b5053-2393-4c95-9800-fc96032fe017"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.437537 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353b5053-2393-4c95-9800-fc96032fe017-kube-api-access-dvsgj" (OuterVolumeSpecName: "kube-api-access-dvsgj") pod "353b5053-2393-4c95-9800-fc96032fe017" (UID: "353b5053-2393-4c95-9800-fc96032fe017"). InnerVolumeSpecName "kube-api-access-dvsgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.437776 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "353b5053-2393-4c95-9800-fc96032fe017" (UID: "353b5053-2393-4c95-9800-fc96032fe017"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.438757 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "353b5053-2393-4c95-9800-fc96032fe017" (UID: "353b5053-2393-4c95-9800-fc96032fe017"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.536214 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvsgj\" (UniqueName: \"kubernetes.io/projected/353b5053-2393-4c95-9800-fc96032fe017-kube-api-access-dvsgj\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.536243 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.536254 4845 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.536264 4845 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/353b5053-2393-4c95-9800-fc96032fe017-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.778638 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-fwkp8"] Feb 02 10:51:35 crc kubenswrapper[4845]: I0202 10:51:35.849764 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-58d87f97d7-w9v5x" podUID="c26d4007-db0b-4379-8431-d6e43dec7e9f" containerName="console" containerID="cri-o://111832ec0e0c78c364956282064b05b1c4c2ce296de61e9f1fa4fa6702a3f91d" gracePeriod=15 Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.301198 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58d87f97d7-w9v5x_c26d4007-db0b-4379-8431-d6e43dec7e9f/console/0.log" Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.301243 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="c26d4007-db0b-4379-8431-d6e43dec7e9f" containerID="111832ec0e0c78c364956282064b05b1c4c2ce296de61e9f1fa4fa6702a3f91d" exitCode=2 Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.301331 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58d87f97d7-w9v5x" event={"ID":"c26d4007-db0b-4379-8431-d6e43dec7e9f","Type":"ContainerDied","Data":"111832ec0e0c78c364956282064b05b1c4c2ce296de61e9f1fa4fa6702a3f91d"} Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.304544 4845 generic.go:334] "Generic (PLEG): container finished" podID="340eb818-166a-42f4-a562-8ffa18018118" containerID="9249fd2f5427ff24d36b33c0b16d83ec165b6227454b663781f59d842989c2da" exitCode=0 Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.304617 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bbgdx" event={"ID":"340eb818-166a-42f4-a562-8ffa18018118","Type":"ContainerDied","Data":"9249fd2f5427ff24d36b33c0b16d83ec165b6227454b663781f59d842989c2da"} Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.304650 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bbgdx" event={"ID":"340eb818-166a-42f4-a562-8ffa18018118","Type":"ContainerStarted","Data":"3c2857e7e1d4b7826e96833417a57da6a98eed0c53437fe71531e7ded412b853"} Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.304670 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gln2q" Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.419737 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gln2q"] Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.444065 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-gln2q"] Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.636510 4845 patch_prober.go:28] interesting pod/console-58d87f97d7-w9v5x container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.88:8443/health\": dial tcp 10.217.0.88:8443: connect: connection refused" start-of-body= Feb 02 10:51:36 crc kubenswrapper[4845]: I0202 10:51:36.636814 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-58d87f97d7-w9v5x" podUID="c26d4007-db0b-4379-8431-d6e43dec7e9f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.88:8443/health\": dial tcp 10.217.0.88:8443: connect: connection refused" Feb 02 10:51:37 crc kubenswrapper[4845]: I0202 10:51:37.315814 4845 generic.go:334] "Generic (PLEG): container finished" podID="2e45ad6a-20f4-4da2-82b7-500ed29a0cd5" containerID="b3ca7a9a30c5133d142f8240cf49d184ac360dc0f74e089c87c96d3a92c7d96d" exitCode=0 Feb 02 10:51:37 crc kubenswrapper[4845]: I0202 10:51:37.315870 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5","Type":"ContainerDied","Data":"b3ca7a9a30c5133d142f8240cf49d184ac360dc0f74e089c87c96d3a92c7d96d"} Feb 02 10:51:37 crc kubenswrapper[4845]: I0202 10:51:37.318357 4845 generic.go:334] "Generic (PLEG): container finished" podID="70739f91-4fde-4bc2-b4e1-5bdb7cb0426c" containerID="16a87dbd784c474f3d4b29bfb2f9739515e8d502ec347cc5ddd63af0721bc8af" exitCode=0 Feb 02 10:51:37 crc kubenswrapper[4845]: I0202 10:51:37.318396 4845 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c","Type":"ContainerDied","Data":"16a87dbd784c474f3d4b29bfb2f9739515e8d502ec347cc5ddd63af0721bc8af"} Feb 02 10:51:37 crc kubenswrapper[4845]: I0202 10:51:37.321972 4845 generic.go:334] "Generic (PLEG): container finished" podID="a61fa08e-868a-4415-88d5-7ed0eebbeb45" containerID="687c373278d72892bc7af53f43d957e03afb3a8da90e855595ea0df53482a4a4" exitCode=0 Feb 02 10:51:37 crc kubenswrapper[4845]: I0202 10:51:37.322092 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"a61fa08e-868a-4415-88d5-7ed0eebbeb45","Type":"ContainerDied","Data":"687c373278d72892bc7af53f43d957e03afb3a8da90e855595ea0df53482a4a4"} Feb 02 10:51:37 crc kubenswrapper[4845]: I0202 10:51:37.726862 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="353b5053-2393-4c95-9800-fc96032fe017" path="/var/lib/kubelet/pods/353b5053-2393-4c95-9800-fc96032fe017/volumes" Feb 02 10:51:38 crc kubenswrapper[4845]: W0202 10:51:38.316360 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacbaf357_af6c_46b6_b6f0_de2b6e4ee44c.slice/crio-b3336106af570a904c4c4f983f9ee425832488db515a8996781482a3bc1d2732 WatchSource:0}: Error finding container b3336106af570a904c4c4f983f9ee425832488db515a8996781482a3bc1d2732: Status 404 returned error can't find the container with id b3336106af570a904c4c4f983f9ee425832488db515a8996781482a3bc1d2732 Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.337154 4845 generic.go:334] "Generic (PLEG): container finished" podID="d0a3a285-364a-4df2-8a7c-947ff673f254" containerID="93c6ac3518c8da645fe78eb56c15a31adc12e1d3538e14b7359e676cb11918c9" exitCode=0 Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.337218 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" 
event={"ID":"d0a3a285-364a-4df2-8a7c-947ff673f254","Type":"ContainerDied","Data":"93c6ac3518c8da645fe78eb56c15a31adc12e1d3538e14b7359e676cb11918c9"} Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.340977 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bbgdx" event={"ID":"340eb818-166a-42f4-a562-8ffa18018118","Type":"ContainerDied","Data":"3c2857e7e1d4b7826e96833417a57da6a98eed0c53437fe71531e7ded412b853"} Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.341006 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c2857e7e1d4b7826e96833417a57da6a98eed0c53437fe71531e7ded412b853" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.342379 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4gj6v" event={"ID":"aa851884-d67b-4c70-8ad6-9dcf92001aa5","Type":"ContainerDied","Data":"7585d99dc88ad40cca31f5ecdb033d0e22df31000a8f8f1c08b3e9e8a1e7d5d0"} Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.342398 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7585d99dc88ad40cca31f5ecdb033d0e22df31000a8f8f1c08b3e9e8a1e7d5d0" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.344764 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1e2c-account-create-update-jxgpc" event={"ID":"13911fd9-043e-424e-ba84-da6af616a202","Type":"ContainerDied","Data":"9493c3898fec2faa6595d94b43b866cde5372c5130166dddd2a82a77a44369f5"} Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.344909 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9493c3898fec2faa6595d94b43b866cde5372c5130166dddd2a82a77a44369f5" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.345904 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fwkp8" 
event={"ID":"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c","Type":"ContainerStarted","Data":"b3336106af570a904c4c4f983f9ee425832488db515a8996781482a3bc1d2732"} Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.549388 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:38 crc kubenswrapper[4845]: E0202 10:51:38.549926 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:51:38 crc kubenswrapper[4845]: E0202 10:51:38.550001 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:51:38 crc kubenswrapper[4845]: E0202 10:51:38.550094 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift podName:d6db6e42-984a-484b-9f90-e6efa9817f37 nodeName:}" failed. No retries permitted until 2026-02-02 10:51:46.550071265 +0000 UTC m=+1187.641472715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift") pod "swift-storage-0" (UID: "d6db6e42-984a-484b-9f90-e6efa9817f37") : configmap "swift-ring-files" not found Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.611351 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.680342 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.685224 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bbgdx" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.756692 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqv8p\" (UniqueName: \"kubernetes.io/projected/340eb818-166a-42f4-a562-8ffa18018118-kube-api-access-mqv8p\") pod \"340eb818-166a-42f4-a562-8ffa18018118\" (UID: \"340eb818-166a-42f4-a562-8ffa18018118\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.756837 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvxqp\" (UniqueName: \"kubernetes.io/projected/aa851884-d67b-4c70-8ad6-9dcf92001aa5-kube-api-access-zvxqp\") pod \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\" (UID: \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.756905 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/340eb818-166a-42f4-a562-8ffa18018118-operator-scripts\") pod \"340eb818-166a-42f4-a562-8ffa18018118\" (UID: \"340eb818-166a-42f4-a562-8ffa18018118\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.757015 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rwsf\" (UniqueName: \"kubernetes.io/projected/13911fd9-043e-424e-ba84-da6af616a202-kube-api-access-7rwsf\") pod \"13911fd9-043e-424e-ba84-da6af616a202\" (UID: \"13911fd9-043e-424e-ba84-da6af616a202\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.757120 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13911fd9-043e-424e-ba84-da6af616a202-operator-scripts\") 
pod \"13911fd9-043e-424e-ba84-da6af616a202\" (UID: \"13911fd9-043e-424e-ba84-da6af616a202\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.757187 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa851884-d67b-4c70-8ad6-9dcf92001aa5-operator-scripts\") pod \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\" (UID: \"aa851884-d67b-4c70-8ad6-9dcf92001aa5\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.764352 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa851884-d67b-4c70-8ad6-9dcf92001aa5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa851884-d67b-4c70-8ad6-9dcf92001aa5" (UID: "aa851884-d67b-4c70-8ad6-9dcf92001aa5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.765327 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/340eb818-166a-42f4-a562-8ffa18018118-kube-api-access-mqv8p" (OuterVolumeSpecName: "kube-api-access-mqv8p") pod "340eb818-166a-42f4-a562-8ffa18018118" (UID: "340eb818-166a-42f4-a562-8ffa18018118"). InnerVolumeSpecName "kube-api-access-mqv8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.765991 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13911fd9-043e-424e-ba84-da6af616a202-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "13911fd9-043e-424e-ba84-da6af616a202" (UID: "13911fd9-043e-424e-ba84-da6af616a202"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.766175 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/340eb818-166a-42f4-a562-8ffa18018118-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "340eb818-166a-42f4-a562-8ffa18018118" (UID: "340eb818-166a-42f4-a562-8ffa18018118"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.774207 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa851884-d67b-4c70-8ad6-9dcf92001aa5-kube-api-access-zvxqp" (OuterVolumeSpecName: "kube-api-access-zvxqp") pod "aa851884-d67b-4c70-8ad6-9dcf92001aa5" (UID: "aa851884-d67b-4c70-8ad6-9dcf92001aa5"). InnerVolumeSpecName "kube-api-access-zvxqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.802593 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13911fd9-043e-424e-ba84-da6af616a202-kube-api-access-7rwsf" (OuterVolumeSpecName: "kube-api-access-7rwsf") pod "13911fd9-043e-424e-ba84-da6af616a202" (UID: "13911fd9-043e-424e-ba84-da6af616a202"). InnerVolumeSpecName "kube-api-access-7rwsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.873370 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqv8p\" (UniqueName: \"kubernetes.io/projected/340eb818-166a-42f4-a562-8ffa18018118-kube-api-access-mqv8p\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.873401 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvxqp\" (UniqueName: \"kubernetes.io/projected/aa851884-d67b-4c70-8ad6-9dcf92001aa5-kube-api-access-zvxqp\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.873410 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/340eb818-166a-42f4-a562-8ffa18018118-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.873420 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rwsf\" (UniqueName: \"kubernetes.io/projected/13911fd9-043e-424e-ba84-da6af616a202-kube-api-access-7rwsf\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.873428 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13911fd9-043e-424e-ba84-da6af616a202-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.873437 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa851884-d67b-4c70-8ad6-9dcf92001aa5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.883962 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58d87f97d7-w9v5x_c26d4007-db0b-4379-8431-d6e43dec7e9f/console/0.log" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 
10:51:38.884048 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.975310 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-service-ca\") pod \"c26d4007-db0b-4379-8431-d6e43dec7e9f\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.975592 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-oauth-config\") pod \"c26d4007-db0b-4379-8431-d6e43dec7e9f\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.975714 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-serving-cert\") pod \"c26d4007-db0b-4379-8431-d6e43dec7e9f\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.975851 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-config\") pod \"c26d4007-db0b-4379-8431-d6e43dec7e9f\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.975971 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp6kt\" (UniqueName: \"kubernetes.io/projected/c26d4007-db0b-4379-8431-d6e43dec7e9f-kube-api-access-fp6kt\") pod \"c26d4007-db0b-4379-8431-d6e43dec7e9f\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 
10:51:38.976963 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-trusted-ca-bundle\") pod \"c26d4007-db0b-4379-8431-d6e43dec7e9f\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.977113 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-oauth-serving-cert\") pod \"c26d4007-db0b-4379-8431-d6e43dec7e9f\" (UID: \"c26d4007-db0b-4379-8431-d6e43dec7e9f\") " Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.978681 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c26d4007-db0b-4379-8431-d6e43dec7e9f" (UID: "c26d4007-db0b-4379-8431-d6e43dec7e9f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.979244 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-service-ca" (OuterVolumeSpecName: "service-ca") pod "c26d4007-db0b-4379-8431-d6e43dec7e9f" (UID: "c26d4007-db0b-4379-8431-d6e43dec7e9f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.979562 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-config" (OuterVolumeSpecName: "console-config") pod "c26d4007-db0b-4379-8431-d6e43dec7e9f" (UID: "c26d4007-db0b-4379-8431-d6e43dec7e9f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.984314 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c26d4007-db0b-4379-8431-d6e43dec7e9f" (UID: "c26d4007-db0b-4379-8431-d6e43dec7e9f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.989709 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c26d4007-db0b-4379-8431-d6e43dec7e9f" (UID: "c26d4007-db0b-4379-8431-d6e43dec7e9f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.991051 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c26d4007-db0b-4379-8431-d6e43dec7e9f-kube-api-access-fp6kt" (OuterVolumeSpecName: "kube-api-access-fp6kt") pod "c26d4007-db0b-4379-8431-d6e43dec7e9f" (UID: "c26d4007-db0b-4379-8431-d6e43dec7e9f"). InnerVolumeSpecName "kube-api-access-fp6kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4845]: I0202 10:51:38.991206 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c26d4007-db0b-4379-8431-d6e43dec7e9f" (UID: "c26d4007-db0b-4379-8431-d6e43dec7e9f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.080688 4845 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.081077 4845 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.081092 4845 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.081106 4845 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.081120 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp6kt\" (UniqueName: \"kubernetes.io/projected/c26d4007-db0b-4379-8431-d6e43dec7e9f-kube-api-access-fp6kt\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.081133 4845 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.081145 4845 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c26d4007-db0b-4379-8431-d6e43dec7e9f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc 
kubenswrapper[4845]: I0202 10:51:39.358278 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerStarted","Data":"953536ca819321a406c08f163b4b6ff072898362e02b3adbe579d686cddbce8e"} Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.365415 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"d0a3a285-364a-4df2-8a7c-947ff673f254","Type":"ContainerStarted","Data":"17bfdabfa77ecf7194164037270605b523d4995460e6380d856305c1f6c0057d"} Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.365675 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.369604 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"a61fa08e-868a-4415-88d5-7ed0eebbeb45","Type":"ContainerStarted","Data":"1350f0f44f8691a3e3bbd0753dc0b0d45e8adf60a35bc0be667a6517ed1450e4"} Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.370588 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.384509 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2e45ad6a-20f4-4da2-82b7-500ed29a0cd5","Type":"ContainerStarted","Data":"df55539d2d0f8d9c780d83c6d45d1c47cd8164e3c370d8ffca6b1ef3a6cabb0b"} Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.384961 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.388485 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58d87f97d7-w9v5x_c26d4007-db0b-4379-8431-d6e43dec7e9f/console/0.log" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.388609 4845 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58d87f97d7-w9v5x" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.388601 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58d87f97d7-w9v5x" event={"ID":"c26d4007-db0b-4379-8431-d6e43dec7e9f","Type":"ContainerDied","Data":"b03473270e278212cfa587bd83c780a50bc52c823bf36377b8f5efc441c8224f"} Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.388792 4845 scope.go:117] "RemoveContainer" containerID="111832ec0e0c78c364956282064b05b1c4c2ce296de61e9f1fa4fa6702a3f91d" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.395115 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bbgdx" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.395171 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"70739f91-4fde-4bc2-b4e1-5bdb7cb0426c","Type":"ContainerStarted","Data":"091da937f8a68216ad0de07a10e107852949dbe38d8a323313bf100aa1da6145"} Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.395421 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4gj6v" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.395673 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1e2c-account-create-update-jxgpc" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.423879 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=39.302130834 podStartE2EDuration="57.423851914s" podCreationTimestamp="2026-02-02 10:50:42 +0000 UTC" firstStartedPulling="2026-02-02 10:50:45.151792619 +0000 UTC m=+1126.243194069" lastFinishedPulling="2026-02-02 10:51:03.273513699 +0000 UTC m=+1144.364915149" observedRunningTime="2026-02-02 10:51:39.393456086 +0000 UTC m=+1180.484857536" watchObservedRunningTime="2026-02-02 10:51:39.423851914 +0000 UTC m=+1180.515253364" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.456279 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=39.228180102 podStartE2EDuration="57.456261769s" podCreationTimestamp="2026-02-02 10:50:42 +0000 UTC" firstStartedPulling="2026-02-02 10:50:45.015977969 +0000 UTC m=+1126.107379409" lastFinishedPulling="2026-02-02 10:51:03.244059626 +0000 UTC m=+1144.335461076" observedRunningTime="2026-02-02 10:51:39.43631688 +0000 UTC m=+1180.527718350" watchObservedRunningTime="2026-02-02 10:51:39.456261769 +0000 UTC m=+1180.547663219" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.481306 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.21157542 podStartE2EDuration="57.481285924s" podCreationTimestamp="2026-02-02 10:50:42 +0000 UTC" firstStartedPulling="2026-02-02 10:50:44.960470659 +0000 UTC m=+1126.051872109" lastFinishedPulling="2026-02-02 10:51:03.230181163 +0000 UTC m=+1144.321582613" observedRunningTime="2026-02-02 10:51:39.474976603 +0000 UTC m=+1180.566378053" watchObservedRunningTime="2026-02-02 10:51:39.481285924 +0000 UTC m=+1180.572687374" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.540288 4845 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58d87f97d7-w9v5x"] Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.547820 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58d87f97d7-w9v5x"] Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.569832 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.592196747 podStartE2EDuration="56.569813911s" podCreationTimestamp="2026-02-02 10:50:43 +0000 UTC" firstStartedPulling="2026-02-02 10:50:45.26638873 +0000 UTC m=+1126.357790180" lastFinishedPulling="2026-02-02 10:51:03.244005894 +0000 UTC m=+1144.335407344" observedRunningTime="2026-02-02 10:51:39.567085573 +0000 UTC m=+1180.658487023" watchObservedRunningTime="2026-02-02 10:51:39.569813911 +0000 UTC m=+1180.661215361" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.742079 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c26d4007-db0b-4379-8431-d6e43dec7e9f" path="/var/lib/kubelet/pods/c26d4007-db0b-4379-8431-d6e43dec7e9f/volumes" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.770083 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.840129 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-z27qx"] Feb 02 10:51:39 crc kubenswrapper[4845]: E0202 10:51:39.840753 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c26d4007-db0b-4379-8431-d6e43dec7e9f" containerName="console" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.840779 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c26d4007-db0b-4379-8431-d6e43dec7e9f" containerName="console" Feb 02 10:51:39 crc kubenswrapper[4845]: E0202 10:51:39.840797 4845 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="aa851884-d67b-4c70-8ad6-9dcf92001aa5" containerName="mariadb-database-create" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.840804 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa851884-d67b-4c70-8ad6-9dcf92001aa5" containerName="mariadb-database-create" Feb 02 10:51:39 crc kubenswrapper[4845]: E0202 10:51:39.840822 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="340eb818-166a-42f4-a562-8ffa18018118" containerName="mariadb-account-create-update" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.840828 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="340eb818-166a-42f4-a562-8ffa18018118" containerName="mariadb-account-create-update" Feb 02 10:51:39 crc kubenswrapper[4845]: E0202 10:51:39.840843 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13911fd9-043e-424e-ba84-da6af616a202" containerName="mariadb-account-create-update" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.840857 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="13911fd9-043e-424e-ba84-da6af616a202" containerName="mariadb-account-create-update" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.841056 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa851884-d67b-4c70-8ad6-9dcf92001aa5" containerName="mariadb-database-create" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.841071 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c26d4007-db0b-4379-8431-d6e43dec7e9f" containerName="console" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.841085 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="13911fd9-043e-424e-ba84-da6af616a202" containerName="mariadb-account-create-update" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.841096 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="340eb818-166a-42f4-a562-8ffa18018118" containerName="mariadb-account-create-update" Feb 02 
10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.841757 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.892247 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-w59pq"] Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.892471 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" podUID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" containerName="dnsmasq-dns" containerID="cri-o://81b2eedcdbc73132319f670807bdc308fb485bc4c468651b7572510bbe0cf821" gracePeriod=10 Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.896590 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712e6155-a77e-4f9c-9d55-a6edab62e9a7-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-z27qx\" (UID: \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.896915 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftsvx\" (UniqueName: \"kubernetes.io/projected/712e6155-a77e-4f9c-9d55-a6edab62e9a7-kube-api-access-ftsvx\") pod \"mysqld-exporter-openstack-cell1-db-create-z27qx\" (UID: \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.906402 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-z27qx"] Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.999014 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/712e6155-a77e-4f9c-9d55-a6edab62e9a7-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-z27qx\" (UID: \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:39 crc kubenswrapper[4845]: I0202 10:51:39.999113 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftsvx\" (UniqueName: \"kubernetes.io/projected/712e6155-a77e-4f9c-9d55-a6edab62e9a7-kube-api-access-ftsvx\") pod \"mysqld-exporter-openstack-cell1-db-create-z27qx\" (UID: \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.000162 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712e6155-a77e-4f9c-9d55-a6edab62e9a7-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-z27qx\" (UID: \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.025083 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftsvx\" (UniqueName: \"kubernetes.io/projected/712e6155-a77e-4f9c-9d55-a6edab62e9a7-kube-api-access-ftsvx\") pod \"mysqld-exporter-openstack-cell1-db-create-z27qx\" (UID: \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.150768 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-c783-account-create-update-c8k62"] Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.152635 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.160242 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.162769 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.173755 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-c783-account-create-update-c8k62"] Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.210221 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85r5h\" (UniqueName: \"kubernetes.io/projected/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-kube-api-access-85r5h\") pod \"mysqld-exporter-c783-account-create-update-c8k62\" (UID: \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\") " pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.210405 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-operator-scripts\") pod \"mysqld-exporter-c783-account-create-update-c8k62\" (UID: \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\") " pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.312380 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-operator-scripts\") pod \"mysqld-exporter-c783-account-create-update-c8k62\" (UID: \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\") " 
pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.312795 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85r5h\" (UniqueName: \"kubernetes.io/projected/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-kube-api-access-85r5h\") pod \"mysqld-exporter-c783-account-create-update-c8k62\" (UID: \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\") " pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.313968 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-operator-scripts\") pod \"mysqld-exporter-c783-account-create-update-c8k62\" (UID: \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\") " pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.359192 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85r5h\" (UniqueName: \"kubernetes.io/projected/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-kube-api-access-85r5h\") pod \"mysqld-exporter-c783-account-create-update-c8k62\" (UID: \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\") " pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.425901 4845 generic.go:334] "Generic (PLEG): container finished" podID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" containerID="81b2eedcdbc73132319f670807bdc308fb485bc4c468651b7572510bbe0cf821" exitCode=0 Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.426897 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" event={"ID":"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e","Type":"ContainerDied","Data":"81b2eedcdbc73132319f670807bdc308fb485bc4c468651b7572510bbe0cf821"} Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.486482 
4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.899098 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-z27qx"] Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.913064 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bbgdx"] Feb 02 10:51:40 crc kubenswrapper[4845]: I0202 10:51:40.926773 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bbgdx"] Feb 02 10:51:41 crc kubenswrapper[4845]: I0202 10:51:41.738642 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="340eb818-166a-42f4-a562-8ffa18018118" path="/var/lib/kubelet/pods/340eb818-166a-42f4-a562-8ffa18018118/volumes" Feb 02 10:51:41 crc kubenswrapper[4845]: I0202 10:51:41.892159 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.051427 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-config\") pod \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.051661 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq6c6\" (UniqueName: \"kubernetes.io/projected/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-kube-api-access-gq6c6\") pod \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.051693 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-dns-svc\") pod \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\" (UID: \"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e\") " Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.079149 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-kube-api-access-gq6c6" (OuterVolumeSpecName: "kube-api-access-gq6c6") pod "7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" (UID: "7d63dd57-08d9-4913-b1d3-36a9c8b5db2e"). InnerVolumeSpecName "kube-api-access-gq6c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.126668 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-config" (OuterVolumeSpecName: "config") pod "7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" (UID: "7d63dd57-08d9-4913-b1d3-36a9c8b5db2e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.159194 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq6c6\" (UniqueName: \"kubernetes.io/projected/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-kube-api-access-gq6c6\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.159243 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.191774 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" (UID: "7d63dd57-08d9-4913-b1d3-36a9c8b5db2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.261628 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.451058 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" event={"ID":"7d63dd57-08d9-4913-b1d3-36a9c8b5db2e","Type":"ContainerDied","Data":"7586b993b7c8e47124f3e9c3b1a81c730d51fb7fd5b1683f7e203dc16fdb11b3"} Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.451596 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-w59pq" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.452007 4845 scope.go:117] "RemoveContainer" containerID="81b2eedcdbc73132319f670807bdc308fb485bc4c468651b7572510bbe0cf821" Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.454918 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerStarted","Data":"a90515d7d5931387eadbfcaa9c4213064cf00c7a915781e7cf67147b71f9ba30"} Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.488944 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-w59pq"] Feb 02 10:51:42 crc kubenswrapper[4845]: I0202 10:51:42.499327 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-w59pq"] Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.040561 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-kgn95"] Feb 02 10:51:43 crc kubenswrapper[4845]: E0202 10:51:43.041063 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" containerName="dnsmasq-dns" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.041081 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" containerName="dnsmasq-dns" Feb 02 10:51:43 crc kubenswrapper[4845]: E0202 10:51:43.041126 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" containerName="init" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.041135 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" containerName="init" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.041396 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" 
containerName="dnsmasq-dns" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.042257 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.046429 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.046684 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-snsd2" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.052066 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kgn95"] Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.180116 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-db-sync-config-data\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.180391 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-combined-ca-bundle\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.180501 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkc25\" (UniqueName: \"kubernetes.io/projected/34877df4-b654-4e0c-ac67-da6fd95c249d-kube-api-access-rkc25\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.180599 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-config-data\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.283004 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-db-sync-config-data\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.283139 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-combined-ca-bundle\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.283170 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkc25\" (UniqueName: \"kubernetes.io/projected/34877df4-b654-4e0c-ac67-da6fd95c249d-kube-api-access-rkc25\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.283207 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-config-data\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.301828 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-config-data\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.301855 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-combined-ca-bundle\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.303255 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-db-sync-config-data\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.309454 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkc25\" (UniqueName: \"kubernetes.io/projected/34877df4-b654-4e0c-ac67-da6fd95c249d-kube-api-access-rkc25\") pod \"glance-db-sync-kgn95\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") " pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.353860 4845 scope.go:117] "RemoveContainer" containerID="932d8191c1cbfeca1c9bdbbf6bdc47b46318b7170046d467a566c55700027df8" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.368981 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-kgn95" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.468573 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" event={"ID":"712e6155-a77e-4f9c-9d55-a6edab62e9a7","Type":"ContainerStarted","Data":"46c4e9ee220ea6b984a975979b0bcc36085734f0da5fd8ac38f3c9ab84f3d5ad"} Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.791356 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d63dd57-08d9-4913-b1d3-36a9c8b5db2e" path="/var/lib/kubelet/pods/7d63dd57-08d9-4913-b1d3-36a9c8b5db2e/volumes" Feb 02 10:51:43 crc kubenswrapper[4845]: I0202 10:51:43.925443 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-c783-account-create-update-c8k62"] Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.085743 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-kgn95"] Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.481244 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fwkp8" event={"ID":"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c","Type":"ContainerStarted","Data":"de36731024371d8d225acca93d7d017177a15b01d17e1764ca4da25552e5472e"} Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.482642 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" event={"ID":"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd","Type":"ContainerStarted","Data":"955f18c841a8c16fdc8a34bb8452d5379275418e606966b811bfd890a335150b"} Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.487694 4845 generic.go:334] "Generic (PLEG): container finished" podID="712e6155-a77e-4f9c-9d55-a6edab62e9a7" containerID="e2da395f32221226555ec9a36b4c70b9ebd84972cf5fd496af49cd196e7172a2" exitCode=0 Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.487767 4845 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" event={"ID":"712e6155-a77e-4f9c-9d55-a6edab62e9a7","Type":"ContainerDied","Data":"e2da395f32221226555ec9a36b4c70b9ebd84972cf5fd496af49cd196e7172a2"} Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.489079 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kgn95" event={"ID":"34877df4-b654-4e0c-ac67-da6fd95c249d","Type":"ContainerStarted","Data":"46b3e4f9d3e3e603b74f4066b84d76558af5b054c23ef7c7b9de906dee295d9c"} Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.510045 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-fwkp8" podStartSLOduration=5.405812998 podStartE2EDuration="10.510022864s" podCreationTimestamp="2026-02-02 10:51:34 +0000 UTC" firstStartedPulling="2026-02-02 10:51:38.324753422 +0000 UTC m=+1179.416154872" lastFinishedPulling="2026-02-02 10:51:43.428963288 +0000 UTC m=+1184.520364738" observedRunningTime="2026-02-02 10:51:44.499172974 +0000 UTC m=+1185.590574424" watchObservedRunningTime="2026-02-02 10:51:44.510022864 +0000 UTC m=+1185.601424314" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.528575 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gxb44"] Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.530094 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.531834 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.544221 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gxb44"] Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.612863 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.617115 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-operator-scripts\") pod \"root-account-create-update-gxb44\" (UID: \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\") " pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.617214 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7f7p\" (UniqueName: \"kubernetes.io/projected/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-kube-api-access-w7f7p\") pod \"root-account-create-update-gxb44\" (UID: \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\") " pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.719295 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-operator-scripts\") pod \"root-account-create-update-gxb44\" (UID: \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\") " pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.719419 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w7f7p\" (UniqueName: \"kubernetes.io/projected/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-kube-api-access-w7f7p\") pod \"root-account-create-update-gxb44\" (UID: \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\") " pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.720106 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-operator-scripts\") pod \"root-account-create-update-gxb44\" (UID: \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\") " pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.737199 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7f7p\" (UniqueName: \"kubernetes.io/projected/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-kube-api-access-w7f7p\") pod \"root-account-create-update-gxb44\" (UID: \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\") " pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:44 crc kubenswrapper[4845]: I0202 10:51:44.880566 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:45 crc kubenswrapper[4845]: I0202 10:51:45.177751 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 02 10:51:45 crc kubenswrapper[4845]: I0202 10:51:45.457524 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gxb44"] Feb 02 10:51:45 crc kubenswrapper[4845]: I0202 10:51:45.515901 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" event={"ID":"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd","Type":"ContainerStarted","Data":"cf759ca4ae8492477c32d3ee2af20b6a746da7ed09b5646264fc1f94d46044d6"} Feb 02 10:51:45 crc kubenswrapper[4845]: I0202 10:51:45.549204 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" podStartSLOduration=5.549182215 podStartE2EDuration="5.549182215s" podCreationTimestamp="2026-02-02 10:51:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:45.535372111 +0000 UTC m=+1186.626773561" watchObservedRunningTime="2026-02-02 10:51:45.549182215 +0000 UTC m=+1186.640583655" Feb 02 10:51:46 crc kubenswrapper[4845]: I0202 10:51:46.534253 4845 generic.go:334] "Generic (PLEG): container finished" podID="e0fdfb88-9683-4cc2-95f1-6ab55c558dfd" containerID="cf759ca4ae8492477c32d3ee2af20b6a746da7ed09b5646264fc1f94d46044d6" exitCode=0 Feb 02 10:51:46 crc kubenswrapper[4845]: I0202 10:51:46.534474 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" event={"ID":"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd","Type":"ContainerDied","Data":"cf759ca4ae8492477c32d3ee2af20b6a746da7ed09b5646264fc1f94d46044d6"} Feb 02 10:51:46 crc kubenswrapper[4845]: I0202 10:51:46.555576 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:51:46 crc kubenswrapper[4845]: E0202 10:51:46.555813 4845 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:51:46 crc kubenswrapper[4845]: E0202 10:51:46.555832 4845 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:51:46 crc kubenswrapper[4845]: E0202 10:51:46.555901 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift podName:d6db6e42-984a-484b-9f90-e6efa9817f37 nodeName:}" failed. No retries permitted until 2026-02-02 10:52:02.555862778 +0000 UTC m=+1203.647264228 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift") pod "swift-storage-0" (UID: "d6db6e42-984a-484b-9f90-e6efa9817f37") : configmap "swift-ring-files" not found Feb 02 10:51:47 crc kubenswrapper[4845]: W0202 10:51:47.285741 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8167c688_97fc_4a4b_9f1f_b0b037c86a9a.slice/crio-ce34701895725b3d5a3763408473ead104dae5a66310eff73d3f6c184da68a43 WatchSource:0}: Error finding container ce34701895725b3d5a3763408473ead104dae5a66310eff73d3f6c184da68a43: Status 404 returned error can't find the container with id ce34701895725b3d5a3763408473ead104dae5a66310eff73d3f6c184da68a43 Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.520695 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.579055 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftsvx\" (UniqueName: \"kubernetes.io/projected/712e6155-a77e-4f9c-9d55-a6edab62e9a7-kube-api-access-ftsvx\") pod \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\" (UID: \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\") " Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.579186 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712e6155-a77e-4f9c-9d55-a6edab62e9a7-operator-scripts\") pod \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\" (UID: \"712e6155-a77e-4f9c-9d55-a6edab62e9a7\") " Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.580969 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/712e6155-a77e-4f9c-9d55-a6edab62e9a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "712e6155-a77e-4f9c-9d55-a6edab62e9a7" (UID: "712e6155-a77e-4f9c-9d55-a6edab62e9a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.584567 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gxb44" event={"ID":"8167c688-97fc-4a4b-9f1f-b0b037c86a9a","Type":"ContainerStarted","Data":"ce34701895725b3d5a3763408473ead104dae5a66310eff73d3f6c184da68a43"} Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.593930 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/712e6155-a77e-4f9c-9d55-a6edab62e9a7-kube-api-access-ftsvx" (OuterVolumeSpecName: "kube-api-access-ftsvx") pod "712e6155-a77e-4f9c-9d55-a6edab62e9a7" (UID: "712e6155-a77e-4f9c-9d55-a6edab62e9a7"). InnerVolumeSpecName "kube-api-access-ftsvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.602442 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.622521 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-z27qx" event={"ID":"712e6155-a77e-4f9c-9d55-a6edab62e9a7","Type":"ContainerDied","Data":"46c4e9ee220ea6b984a975979b0bcc36085734f0da5fd8ac38f3c9ab84f3d5ad"} Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.622567 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46c4e9ee220ea6b984a975979b0bcc36085734f0da5fd8ac38f3c9ab84f3d5ad" Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.682200 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftsvx\" (UniqueName: \"kubernetes.io/projected/712e6155-a77e-4f9c-9d55-a6edab62e9a7-kube-api-access-ftsvx\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:47 crc kubenswrapper[4845]: I0202 10:51:47.682241 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/712e6155-a77e-4f9c-9d55-a6edab62e9a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.028482 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.087845 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85r5h\" (UniqueName: \"kubernetes.io/projected/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-kube-api-access-85r5h\") pod \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\" (UID: \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\") " Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.088137 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-operator-scripts\") pod \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\" (UID: \"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd\") " Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.089025 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0fdfb88-9683-4cc2-95f1-6ab55c558dfd" (UID: "e0fdfb88-9683-4cc2-95f1-6ab55c558dfd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.094328 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-kube-api-access-85r5h" (OuterVolumeSpecName: "kube-api-access-85r5h") pod "e0fdfb88-9683-4cc2-95f1-6ab55c558dfd" (UID: "e0fdfb88-9683-4cc2-95f1-6ab55c558dfd"). InnerVolumeSpecName "kube-api-access-85r5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.189519 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85r5h\" (UniqueName: \"kubernetes.io/projected/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-kube-api-access-85r5h\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.189551 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.620228 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" event={"ID":"e0fdfb88-9683-4cc2-95f1-6ab55c558dfd","Type":"ContainerDied","Data":"955f18c841a8c16fdc8a34bb8452d5379275418e606966b811bfd890a335150b"} Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.620667 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955f18c841a8c16fdc8a34bb8452d5379275418e606966b811bfd890a335150b" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.621119 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-c783-account-create-update-c8k62" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.624239 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerStarted","Data":"2061b24c8461fbcce0ec47e59e7dc4cc77062ce142faa619d444cd1d0c09e5c8"} Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.626754 4845 generic.go:334] "Generic (PLEG): container finished" podID="8167c688-97fc-4a4b-9f1f-b0b037c86a9a" containerID="13f6f84ab8aba03eaa86df418135b90d82987c32100a7069900fe6528abad51b" exitCode=0 Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.626802 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gxb44" event={"ID":"8167c688-97fc-4a4b-9f1f-b0b037c86a9a","Type":"ContainerDied","Data":"13f6f84ab8aba03eaa86df418135b90d82987c32100a7069900fe6528abad51b"} Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.653254 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.211530805 podStartE2EDuration="59.653230062s" podCreationTimestamp="2026-02-02 10:50:49 +0000 UTC" firstStartedPulling="2026-02-02 10:51:04.944360575 +0000 UTC m=+1146.035762025" lastFinishedPulling="2026-02-02 10:51:47.386059832 +0000 UTC m=+1188.477461282" observedRunningTime="2026-02-02 10:51:48.650691579 +0000 UTC m=+1189.742093049" watchObservedRunningTime="2026-02-02 10:51:48.653230062 +0000 UTC m=+1189.744631512" Feb 02 10:51:48 crc kubenswrapper[4845]: I0202 10:51:48.883092 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tt4db" podUID="72da7703-b176-47cb-953e-de037d663c55" containerName="ovn-controller" probeResult="failure" output=< Feb 02 10:51:48 crc kubenswrapper[4845]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status 
Feb 02 10:51:48 crc kubenswrapper[4845]: > Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.151486 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.223538 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:51:50 crc kubenswrapper[4845]: E0202 10:51:50.224084 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0fdfb88-9683-4cc2-95f1-6ab55c558dfd" containerName="mariadb-account-create-update" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.224106 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0fdfb88-9683-4cc2-95f1-6ab55c558dfd" containerName="mariadb-account-create-update" Feb 02 10:51:50 crc kubenswrapper[4845]: E0202 10:51:50.224124 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8167c688-97fc-4a4b-9f1f-b0b037c86a9a" containerName="mariadb-account-create-update" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.224132 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8167c688-97fc-4a4b-9f1f-b0b037c86a9a" containerName="mariadb-account-create-update" Feb 02 10:51:50 crc kubenswrapper[4845]: E0202 10:51:50.224153 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="712e6155-a77e-4f9c-9d55-a6edab62e9a7" containerName="mariadb-database-create" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.224160 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="712e6155-a77e-4f9c-9d55-a6edab62e9a7" containerName="mariadb-database-create" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.225030 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="8167c688-97fc-4a4b-9f1f-b0b037c86a9a" containerName="mariadb-account-create-update" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.225062 4845 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e0fdfb88-9683-4cc2-95f1-6ab55c558dfd" containerName="mariadb-account-create-update" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.225128 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="712e6155-a77e-4f9c-9d55-a6edab62e9a7" containerName="mariadb-database-create" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.226232 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.229140 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.240864 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.258554 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7f7p\" (UniqueName: \"kubernetes.io/projected/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-kube-api-access-w7f7p\") pod \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\" (UID: \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\") " Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.258771 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-operator-scripts\") pod \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\" (UID: \"8167c688-97fc-4a4b-9f1f-b0b037c86a9a\") " Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.259321 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8167c688-97fc-4a4b-9f1f-b0b037c86a9a" (UID: "8167c688-97fc-4a4b-9f1f-b0b037c86a9a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.267204 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-kube-api-access-w7f7p" (OuterVolumeSpecName: "kube-api-access-w7f7p") pod "8167c688-97fc-4a4b-9f1f-b0b037c86a9a" (UID: "8167c688-97fc-4a4b-9f1f-b0b037c86a9a"). InnerVolumeSpecName "kube-api-access-w7f7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.361412 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zjmd\" (UniqueName: \"kubernetes.io/projected/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-kube-api-access-5zjmd\") pod \"mysqld-exporter-0\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.361553 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.361580 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-config-data\") pod \"mysqld-exporter-0\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.361702 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7f7p\" (UniqueName: \"kubernetes.io/projected/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-kube-api-access-w7f7p\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:50 crc 
kubenswrapper[4845]: I0202 10:51:50.361719 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8167c688-97fc-4a4b-9f1f-b0b037c86a9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.463134 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.463189 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-config-data\") pod \"mysqld-exporter-0\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.463421 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zjmd\" (UniqueName: \"kubernetes.io/projected/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-kube-api-access-5zjmd\") pod \"mysqld-exporter-0\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.471072 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-config-data\") pod \"mysqld-exporter-0\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.489982 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: 
\"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.496028 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zjmd\" (UniqueName: \"kubernetes.io/projected/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-kube-api-access-5zjmd\") pod \"mysqld-exporter-0\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.561697 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.589592 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.589678 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.591237 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.655702 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gxb44" event={"ID":"8167c688-97fc-4a4b-9f1f-b0b037c86a9a","Type":"ContainerDied","Data":"ce34701895725b3d5a3763408473ead104dae5a66310eff73d3f6c184da68a43"} Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.655764 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce34701895725b3d5a3763408473ead104dae5a66310eff73d3f6c184da68a43" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.655955 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gxb44" Feb 02 10:51:50 crc kubenswrapper[4845]: I0202 10:51:50.657542 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 02 10:51:51 crc kubenswrapper[4845]: I0202 10:51:51.117708 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:51:51 crc kubenswrapper[4845]: W0202 10:51:51.119920 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podada4f3a2_2715_4c0c_bc32_5c488a2e1996.slice/crio-4f3a714246c19659b6b750ef01e05f53d25576adc8d172e3db972b276255f379 WatchSource:0}: Error finding container 4f3a714246c19659b6b750ef01e05f53d25576adc8d172e3db972b276255f379: Status 404 returned error can't find the container with id 4f3a714246c19659b6b750ef01e05f53d25576adc8d172e3db972b276255f379 Feb 02 10:51:51 crc kubenswrapper[4845]: I0202 10:51:51.667633 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ada4f3a2-2715-4c0c-bc32-5c488a2e1996","Type":"ContainerStarted","Data":"4f3a714246c19659b6b750ef01e05f53d25576adc8d172e3db972b276255f379"} Feb 02 10:51:52 crc kubenswrapper[4845]: I0202 10:51:52.685235 4845 generic.go:334] "Generic (PLEG): container finished" podID="acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" containerID="de36731024371d8d225acca93d7d017177a15b01d17e1764ca4da25552e5472e" exitCode=0 Feb 02 10:51:52 crc kubenswrapper[4845]: I0202 10:51:52.685341 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fwkp8" event={"ID":"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c","Type":"ContainerDied","Data":"de36731024371d8d225acca93d7d017177a15b01d17e1764ca4da25552e5472e"} Feb 02 10:51:53 crc kubenswrapper[4845]: I0202 10:51:53.894711 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 10:51:53 
crc kubenswrapper[4845]: I0202 10:51:53.895320 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="prometheus" containerID="cri-o://953536ca819321a406c08f163b4b6ff072898362e02b3adbe579d686cddbce8e" gracePeriod=600 Feb 02 10:51:53 crc kubenswrapper[4845]: I0202 10:51:53.895547 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="thanos-sidecar" containerID="cri-o://2061b24c8461fbcce0ec47e59e7dc4cc77062ce142faa619d444cd1d0c09e5c8" gracePeriod=600 Feb 02 10:51:53 crc kubenswrapper[4845]: I0202 10:51:53.895713 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="config-reloader" containerID="cri-o://a90515d7d5931387eadbfcaa9c4213064cf00c7a915781e7cf67147b71f9ba30" gracePeriod=600 Feb 02 10:51:53 crc kubenswrapper[4845]: I0202 10:51:53.912213 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tt4db" podUID="72da7703-b176-47cb-953e-de037d663c55" containerName="ovn-controller" probeResult="failure" output=< Feb 02 10:51:53 crc kubenswrapper[4845]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 02 10:51:53 crc kubenswrapper[4845]: > Feb 02 10:51:53 crc kubenswrapper[4845]: I0202 10:51:53.956053 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9qwr2" Feb 02 10:51:53 crc kubenswrapper[4845]: I0202 10:51:53.968190 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9qwr2" Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.170413 4845 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/rabbitmq-server-0" podUID="2e45ad6a-20f4-4da2-82b7-500ed29a0cd5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.201929 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-tt4db-config-n8v8r"]
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.203468 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.207290 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.215758 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tt4db-config-n8v8r"]
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.255449 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="a61fa08e-868a-4415-88d5-7ed0eebbeb45" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.131:5671: connect: connection refused"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.355977 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jb5d\" (UniqueName: \"kubernetes.io/projected/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-kube-api-access-6jb5d\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.356200 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run-ovn\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.356346 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.356398 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-additional-scripts\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.356439 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-log-ovn\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.356632 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-scripts\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.459013 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jb5d\" (UniqueName: \"kubernetes.io/projected/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-kube-api-access-6jb5d\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.459072 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run-ovn\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.459157 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.459199 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-additional-scripts\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.459232 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-log-ovn\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.459342 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-scripts\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.461220 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.461586 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run-ovn\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.461677 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-log-ovn\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.462181 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-additional-scripts\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.462698 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-scripts\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.490566 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jb5d\" (UniqueName: \"kubernetes.io/projected/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-kube-api-access-6jb5d\") pod \"ovn-controller-tt4db-config-n8v8r\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.515951 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="d0a3a285-364a-4df2-8a7c-947ff673f254" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.130:5671: connect: connection refused"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.552653 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tt4db-config-n8v8r"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.615108 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.758744 4845 generic.go:334] "Generic (PLEG): container finished" podID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerID="2061b24c8461fbcce0ec47e59e7dc4cc77062ce142faa619d444cd1d0c09e5c8" exitCode=0
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.758795 4845 generic.go:334] "Generic (PLEG): container finished" podID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerID="a90515d7d5931387eadbfcaa9c4213064cf00c7a915781e7cf67147b71f9ba30" exitCode=0
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.758805 4845 generic.go:334] "Generic (PLEG): container finished" podID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerID="953536ca819321a406c08f163b4b6ff072898362e02b3adbe579d686cddbce8e" exitCode=0
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.759988 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerDied","Data":"2061b24c8461fbcce0ec47e59e7dc4cc77062ce142faa619d444cd1d0c09e5c8"}
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.760057 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerDied","Data":"a90515d7d5931387eadbfcaa9c4213064cf00c7a915781e7cf67147b71f9ba30"}
Feb 02 10:51:54 crc kubenswrapper[4845]: I0202 10:51:54.760075 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerDied","Data":"953536ca819321a406c08f163b4b6ff072898362e02b3adbe579d686cddbce8e"}
Feb 02 10:51:55 crc kubenswrapper[4845]: I0202 10:51:55.589920 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.139:9090/-/ready\": dial tcp 10.217.0.139:9090: connect: connection refused"
Feb 02 10:51:55 crc kubenswrapper[4845]: I0202 10:51:55.923477 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gxb44"]
Feb 02 10:51:55 crc kubenswrapper[4845]: I0202 10:51:55.932572 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gxb44"]
Feb 02 10:51:57 crc kubenswrapper[4845]: I0202 10:51:57.727158 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8167c688-97fc-4a4b-9f1f-b0b037c86a9a" path="/var/lib/kubelet/pods/8167c688-97fc-4a4b-9f1f-b0b037c86a9a/volumes"
Feb 02 10:51:58 crc kubenswrapper[4845]: I0202 10:51:58.897245 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-tt4db" podUID="72da7703-b176-47cb-953e-de037d663c55" containerName="ovn-controller" probeResult="failure" output=<
Feb 02 10:51:58 crc kubenswrapper[4845]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 02 10:51:58 crc kubenswrapper[4845]: >
Feb 02 10:52:00 crc kubenswrapper[4845]: I0202 10:52:00.590094 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.139:9090/-/ready\": dial tcp 10.217.0.139:9090: connect: connection refused"
Feb 02 10:52:00 crc kubenswrapper[4845]: I0202 10:52:00.942550 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qqb26"]
Feb 02 10:52:00 crc kubenswrapper[4845]: I0202 10:52:00.945141 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qqb26"
Feb 02 10:52:00 crc kubenswrapper[4845]: I0202 10:52:00.948714 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 02 10:52:00 crc kubenswrapper[4845]: I0202 10:52:00.950845 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qqb26"]
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.103300 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b9529c-8c20-47e9-8c19-910a31b30683-operator-scripts\") pod \"root-account-create-update-qqb26\" (UID: \"62b9529c-8c20-47e9-8c19-910a31b30683\") " pod="openstack/root-account-create-update-qqb26"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.103457 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjdkt\" (UniqueName: \"kubernetes.io/projected/62b9529c-8c20-47e9-8c19-910a31b30683-kube-api-access-rjdkt\") pod \"root-account-create-update-qqb26\" (UID: \"62b9529c-8c20-47e9-8c19-910a31b30683\") " pod="openstack/root-account-create-update-qqb26"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.205412 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjdkt\" (UniqueName: \"kubernetes.io/projected/62b9529c-8c20-47e9-8c19-910a31b30683-kube-api-access-rjdkt\") pod \"root-account-create-update-qqb26\" (UID: \"62b9529c-8c20-47e9-8c19-910a31b30683\") " pod="openstack/root-account-create-update-qqb26"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.205583 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b9529c-8c20-47e9-8c19-910a31b30683-operator-scripts\") pod \"root-account-create-update-qqb26\" (UID: \"62b9529c-8c20-47e9-8c19-910a31b30683\") " pod="openstack/root-account-create-update-qqb26"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.206431 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b9529c-8c20-47e9-8c19-910a31b30683-operator-scripts\") pod \"root-account-create-update-qqb26\" (UID: \"62b9529c-8c20-47e9-8c19-910a31b30683\") " pod="openstack/root-account-create-update-qqb26"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.245664 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjdkt\" (UniqueName: \"kubernetes.io/projected/62b9529c-8c20-47e9-8c19-910a31b30683-kube-api-access-rjdkt\") pod \"root-account-create-update-qqb26\" (UID: \"62b9529c-8c20-47e9-8c19-910a31b30683\") " pod="openstack/root-account-create-update-qqb26"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.297783 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qqb26"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.506367 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.612504 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-etc-swift\") pod \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.612808 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-combined-ca-bundle\") pod \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.612875 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-ring-data-devices\") pod \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.612927 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-swiftconf\") pod \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.613028 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-scripts\") pod \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.613057 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v29f4\" (UniqueName: \"kubernetes.io/projected/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-kube-api-access-v29f4\") pod \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.613112 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-dispersionconf\") pod \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\" (UID: \"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.613665 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" (UID: "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.615676 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" (UID: "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.678163 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-kube-api-access-v29f4" (OuterVolumeSpecName: "kube-api-access-v29f4") pod "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" (UID: "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c"). InnerVolumeSpecName "kube-api-access-v29f4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.682206 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" (UID: "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.712499 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" (UID: "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.715868 4845 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.715909 4845 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.715920 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.715929 4845 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.715939 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v29f4\" (UniqueName: \"kubernetes.io/projected/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-kube-api-access-v29f4\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.741356 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" (UID: "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.742152 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-scripts" (OuterVolumeSpecName: "scripts") pod "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" (UID: "acbaf357-af6c-46b6-b6f0-de2b6e4ee44c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.809338 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.818442 4845 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.818478 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acbaf357-af6c-46b6-b6f0-de2b6e4ee44c-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.867739 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-fwkp8"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.867746 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-fwkp8" event={"ID":"acbaf357-af6c-46b6-b6f0-de2b6e4ee44c","Type":"ContainerDied","Data":"b3336106af570a904c4c4f983f9ee425832488db515a8996781482a3bc1d2732"}
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.867793 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3336106af570a904c4c4f983f9ee425832488db515a8996781482a3bc1d2732"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.895327 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9b04f366-8a31-4d2e-8d11-e8682d578a07","Type":"ContainerDied","Data":"f66ab08a88fdf01ed8eac1ea6cefb40d4702621c1aec3526c050777cfd6e0be7"}
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.895433 4845 scope.go:117] "RemoveContainer" containerID="2061b24c8461fbcce0ec47e59e7dc4cc77062ce142faa619d444cd1d0c09e5c8"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.895761 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.913269 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ada4f3a2-2715-4c0c-bc32-5c488a2e1996","Type":"ContainerStarted","Data":"995f2b7a30c667d098b78f0a5f78fb72fb30f1f5da96bbc6a50e3a4f536e40bb"}
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.921273 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-web-config\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.921414 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-2\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.921509 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-tls-assets\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.921601 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-thanos-prometheus-http-client-file\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.921790 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9b04f366-8a31-4d2e-8d11-e8682d578a07-config-out\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.922610 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.923123 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md5br\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-kube-api-access-md5br\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.923194 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-config\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.923483 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.923578 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-0\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.927472 4845 scope.go:117] "RemoveContainer" containerID="a90515d7d5931387eadbfcaa9c4213064cf00c7a915781e7cf67147b71f9ba30"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.928400 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.929435 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-config" (OuterVolumeSpecName: "config") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.929607 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b04f366-8a31-4d2e-8d11-e8682d578a07-config-out" (OuterVolumeSpecName: "config-out") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.934570 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-1\") pod \"9b04f366-8a31-4d2e-8d11-e8682d578a07\" (UID: \"9b04f366-8a31-4d2e-8d11-e8682d578a07\") "
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.942408 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-kube-api-access-md5br" (OuterVolumeSpecName: "kube-api-access-md5br") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "kube-api-access-md5br". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.942600 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.944181 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947241 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947724 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md5br\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-kube-api-access-md5br\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947757 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947771 4845 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947789 4845 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947804 4845 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9b04f366-8a31-4d2e-8d11-e8682d578a07-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947815 4845 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9b04f366-8a31-4d2e-8d11-e8682d578a07-tls-assets\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947826 4845 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.947835 4845 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9b04f366-8a31-4d2e-8d11-e8682d578a07-config-out\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.953672 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=1.769515347 podStartE2EDuration="11.953651727s" podCreationTimestamp="2026-02-02 10:51:50 +0000 UTC" firstStartedPulling="2026-02-02 10:51:51.126342004 +0000 UTC m=+1192.217743454" lastFinishedPulling="2026-02-02 10:52:01.310478384 +0000 UTC m=+1202.401879834" observedRunningTime="2026-02-02 10:52:01.943391615 +0000 UTC m=+1203.034793065" watchObservedRunningTime="2026-02-02 10:52:01.953651727 +0000 UTC m=+1203.045053177"
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.981516 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-web-config" (OuterVolumeSpecName: "web-config") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:01 crc kubenswrapper[4845]: I0202 10:52:01.985949 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9b04f366-8a31-4d2e-8d11-e8682d578a07" (UID: "9b04f366-8a31-4d2e-8d11-e8682d578a07"). InnerVolumeSpecName "pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.037051 4845 scope.go:117] "RemoveContainer" containerID="953536ca819321a406c08f163b4b6ff072898362e02b3adbe579d686cddbce8e"
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.055055 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") on node \"crc\" "
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.055101 4845 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9b04f366-8a31-4d2e-8d11-e8682d578a07-web-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.092879 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-tt4db-config-n8v8r"]
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.107520 4845 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.108199 4845 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937") on node "crc"
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.114372 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qqb26"]
Feb 02 10:52:02 crc kubenswrapper[4845]: W0202 10:52:02.128029 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62b9529c_8c20_47e9_8c19_910a31b30683.slice/crio-46b1e7cd3db604e0df6f962e129a926ea3c2ab65a02772382c80d9cbafe9cef0 WatchSource:0}: Error finding container 46b1e7cd3db604e0df6f962e129a926ea3c2ab65a02772382c80d9cbafe9cef0: Status 404 returned error can't find the container with id 46b1e7cd3db604e0df6f962e129a926ea3c2ab65a02772382c80d9cbafe9cef0
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.138570 4845 scope.go:117] "RemoveContainer" containerID="f241191074a6c7fafb8932f36a972acd0bd84c8ac50c73e751b3fdd46aa2e817"
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.158935 4845 reconciler_common.go:293] "Volume detached for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.385933 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.405060 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422088 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 02 crc
kubenswrapper[4845]: E0202 10:52:02.422509 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" containerName="swift-ring-rebalance" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422521 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" containerName="swift-ring-rebalance" Feb 02 10:52:02 crc kubenswrapper[4845]: E0202 10:52:02.422530 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="init-config-reloader" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422536 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="init-config-reloader" Feb 02 10:52:02 crc kubenswrapper[4845]: E0202 10:52:02.422545 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="config-reloader" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422553 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="config-reloader" Feb 02 10:52:02 crc kubenswrapper[4845]: E0202 10:52:02.422569 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="thanos-sidecar" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422575 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="thanos-sidecar" Feb 02 10:52:02 crc kubenswrapper[4845]: E0202 10:52:02.422592 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="prometheus" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422597 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="prometheus" Feb 02 10:52:02 crc 
kubenswrapper[4845]: I0202 10:52:02.422805 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="prometheus" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422819 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="thanos-sidecar" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422832 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbaf357-af6c-46b6-b6f0-de2b6e4ee44c" containerName="swift-ring-rebalance" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.422838 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" containerName="config-reloader" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.424673 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.431870 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.432208 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.432403 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.434210 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.434475 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.434521 4845 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.434772 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.435516 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.438959 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-wp8jb" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.443772 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571508 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31859db3-3de0-46d0-a81b-b951f1d45279-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571555 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571582 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod \"prometheus-metric-storage-0\" (UID: 
\"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571656 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571690 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571728 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571759 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31859db3-3de0-46d0-a81b-b951f1d45279-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 
10:52:02.571779 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571806 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571827 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571866 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-config\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.571962 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg7qv\" (UniqueName: \"kubernetes.io/projected/31859db3-3de0-46d0-a81b-b951f1d45279-kube-api-access-jg7qv\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc 
kubenswrapper[4845]: I0202 10:52:02.571989 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.572027 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.591472 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6db6e42-984a-484b-9f90-e6efa9817f37-etc-swift\") pod \"swift-storage-0\" (UID: \"d6db6e42-984a-484b-9f90-e6efa9817f37\") " pod="openstack/swift-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.673490 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.674860 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.675001 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.675123 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31859db3-3de0-46d0-a81b-b951f1d45279-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.675250 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.675354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.675484 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-config\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.675691 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg7qv\" (UniqueName: \"kubernetes.io/projected/31859db3-3de0-46d0-a81b-b951f1d45279-kube-api-access-jg7qv\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.675803 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.676448 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.676585 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31859db3-3de0-46d0-a81b-b951f1d45279-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.676735 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.676838 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.678253 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.676463 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.682820 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.682862 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9b560f795087ddb8e1c0fbe0076d2f0e9dba0d3739abc904f350829f75b851b7/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.684433 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.687089 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.687913 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-config\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.688032 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/31859db3-3de0-46d0-a81b-b951f1d45279-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.689998 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.691489 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/31859db3-3de0-46d0-a81b-b951f1d45279-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.712532 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.720286 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31859db3-3de0-46d0-a81b-b951f1d45279-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.721853 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.725829 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg7qv\" (UniqueName: \"kubernetes.io/projected/31859db3-3de0-46d0-a81b-b951f1d45279-kube-api-access-jg7qv\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.721870 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31859db3-3de0-46d0-a81b-b951f1d45279-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.768194 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c84ebbd0-5f32-4774-9e9f-fdc701180937\") pod 
\"prometheus-metric-storage-0\" (UID: \"31859db3-3de0-46d0-a81b-b951f1d45279\") " pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.928192 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kgn95" event={"ID":"34877df4-b654-4e0c-ac67-da6fd95c249d","Type":"ContainerStarted","Data":"3a7ec3be2e02a83d1849c451b822789235edc5a4672079139f59894d6d036a70"} Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.940629 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tt4db-config-n8v8r" event={"ID":"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7","Type":"ContainerStarted","Data":"2fbcebcb43f4c40bf6cc312cfba0d8248bb4757b182770fa199288683be575a0"} Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.940680 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tt4db-config-n8v8r" event={"ID":"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7","Type":"ContainerStarted","Data":"34e5f3fc33a4028b7075045963bff9f4ca350c1123047d998cc5412b97fcacd8"} Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.944984 4845 generic.go:334] "Generic (PLEG): container finished" podID="62b9529c-8c20-47e9-8c19-910a31b30683" containerID="8f242d53b8b45b41f78ed81a7dfba88bead6b72ed12835b100468efa95d3864d" exitCode=0 Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.947926 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qqb26" event={"ID":"62b9529c-8c20-47e9-8c19-910a31b30683","Type":"ContainerDied","Data":"8f242d53b8b45b41f78ed81a7dfba88bead6b72ed12835b100468efa95d3864d"} Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.947987 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qqb26" event={"ID":"62b9529c-8c20-47e9-8c19-910a31b30683","Type":"ContainerStarted","Data":"46b1e7cd3db604e0df6f962e129a926ea3c2ab65a02772382c80d9cbafe9cef0"} Feb 02 10:52:02 crc 
kubenswrapper[4845]: I0202 10:52:02.950106 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-kgn95" podStartSLOduration=2.69404387 podStartE2EDuration="19.950086028s" podCreationTimestamp="2026-02-02 10:51:43 +0000 UTC" firstStartedPulling="2026-02-02 10:51:44.091280028 +0000 UTC m=+1185.182681478" lastFinishedPulling="2026-02-02 10:52:01.347322186 +0000 UTC m=+1202.438723636" observedRunningTime="2026-02-02 10:52:02.947579456 +0000 UTC m=+1204.038980906" watchObservedRunningTime="2026-02-02 10:52:02.950086028 +0000 UTC m=+1204.041487478" Feb 02 10:52:02 crc kubenswrapper[4845]: I0202 10:52:02.969221 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-tt4db-config-n8v8r" podStartSLOduration=8.969200803 podStartE2EDuration="8.969200803s" podCreationTimestamp="2026-02-02 10:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:02.9687255 +0000 UTC m=+1204.060126980" watchObservedRunningTime="2026-02-02 10:52:02.969200803 +0000 UTC m=+1204.060602253" Feb 02 10:52:03 crc kubenswrapper[4845]: I0202 10:52:03.055765 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:03 crc kubenswrapper[4845]: W0202 10:52:03.474641 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6db6e42_984a_484b_9f90_e6efa9817f37.slice/crio-bb98dfc1bdb4a446eae63e770a43a380562aa21e2db2e8c9e29801ba96768172 WatchSource:0}: Error finding container bb98dfc1bdb4a446eae63e770a43a380562aa21e2db2e8c9e29801ba96768172: Status 404 returned error can't find the container with id bb98dfc1bdb4a446eae63e770a43a380562aa21e2db2e8c9e29801ba96768172 Feb 02 10:52:03 crc kubenswrapper[4845]: I0202 10:52:03.477240 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:52:03 crc kubenswrapper[4845]: I0202 10:52:03.486784 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:52:03 crc kubenswrapper[4845]: I0202 10:52:03.663560 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 10:52:03 crc kubenswrapper[4845]: I0202 10:52:03.734057 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b04f366-8a31-4d2e-8d11-e8682d578a07" path="/var/lib/kubelet/pods/9b04f366-8a31-4d2e-8d11-e8682d578a07/volumes" Feb 02 10:52:03 crc kubenswrapper[4845]: I0202 10:52:03.975598 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"bb98dfc1bdb4a446eae63e770a43a380562aa21e2db2e8c9e29801ba96768172"} Feb 02 10:52:03 crc kubenswrapper[4845]: I0202 10:52:03.981456 4845 generic.go:334] "Generic (PLEG): container finished" podID="0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" containerID="2fbcebcb43f4c40bf6cc312cfba0d8248bb4757b182770fa199288683be575a0" exitCode=0 Feb 02 10:52:03 crc kubenswrapper[4845]: I0202 10:52:03.981564 4845 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovn-controller-tt4db-config-n8v8r" event={"ID":"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7","Type":"ContainerDied","Data":"2fbcebcb43f4c40bf6cc312cfba0d8248bb4757b182770fa199288683be575a0"} Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.001341 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"31859db3-3de0-46d0-a81b-b951f1d45279","Type":"ContainerStarted","Data":"78313a6e4ad0baac042d452b1cef704646e4f126bb940a0867bc4032b6af8ae4"} Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.033535 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-tt4db" Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.173127 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.258091 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.520393 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.526372 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qqb26" Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.640733 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjdkt\" (UniqueName: \"kubernetes.io/projected/62b9529c-8c20-47e9-8c19-910a31b30683-kube-api-access-rjdkt\") pod \"62b9529c-8c20-47e9-8c19-910a31b30683\" (UID: \"62b9529c-8c20-47e9-8c19-910a31b30683\") " Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.641105 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b9529c-8c20-47e9-8c19-910a31b30683-operator-scripts\") pod \"62b9529c-8c20-47e9-8c19-910a31b30683\" (UID: \"62b9529c-8c20-47e9-8c19-910a31b30683\") " Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.645242 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62b9529c-8c20-47e9-8c19-910a31b30683-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62b9529c-8c20-47e9-8c19-910a31b30683" (UID: "62b9529c-8c20-47e9-8c19-910a31b30683"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.678366 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b9529c-8c20-47e9-8c19-910a31b30683-kube-api-access-rjdkt" (OuterVolumeSpecName: "kube-api-access-rjdkt") pod "62b9529c-8c20-47e9-8c19-910a31b30683" (UID: "62b9529c-8c20-47e9-8c19-910a31b30683"). InnerVolumeSpecName "kube-api-access-rjdkt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.743534 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62b9529c-8c20-47e9-8c19-910a31b30683-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:04 crc kubenswrapper[4845]: I0202 10:52:04.743560 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjdkt\" (UniqueName: \"kubernetes.io/projected/62b9529c-8c20-47e9-8c19-910a31b30683-kube-api-access-rjdkt\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.012149 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qqb26" event={"ID":"62b9529c-8c20-47e9-8c19-910a31b30683","Type":"ContainerDied","Data":"46b1e7cd3db604e0df6f962e129a926ea3c2ab65a02772382c80d9cbafe9cef0"} Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.012221 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b1e7cd3db604e0df6f962e129a926ea3c2ab65a02772382c80d9cbafe9cef0" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.012165 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qqb26" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.507100 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-tt4db-config-n8v8r" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.559701 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run\") pod \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.559763 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-additional-scripts\") pod \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.559799 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run" (OuterVolumeSpecName: "var-run") pod "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" (UID: "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.559875 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jb5d\" (UniqueName: \"kubernetes.io/projected/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-kube-api-access-6jb5d\") pod \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.559935 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run-ovn\") pod \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.559961 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-scripts\") pod \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.560084 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-log-ovn\") pod \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\" (UID: \"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7\") " Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.560299 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" (UID: "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.560495 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" (UID: "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.560802 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" (UID: "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.560921 4845 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.560941 4845 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.560956 4845 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-var-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.561658 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-scripts" (OuterVolumeSpecName: "scripts") pod 
"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" (UID: "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.570427 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-kube-api-access-6jb5d" (OuterVolumeSpecName: "kube-api-access-6jb5d") pod "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" (UID: "0bb68da2-57ab-4c1b-8e95-e1d434d6cae7"). InnerVolumeSpecName "kube-api-access-6jb5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.663650 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.664052 4845 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:05 crc kubenswrapper[4845]: I0202 10:52:05.664154 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jb5d\" (UniqueName: \"kubernetes.io/projected/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7-kube-api-access-6jb5d\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:06 crc kubenswrapper[4845]: I0202 10:52:06.030033 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-tt4db-config-n8v8r" event={"ID":"0bb68da2-57ab-4c1b-8e95-e1d434d6cae7","Type":"ContainerDied","Data":"34e5f3fc33a4028b7075045963bff9f4ca350c1123047d998cc5412b97fcacd8"} Feb 02 10:52:06 crc kubenswrapper[4845]: I0202 10:52:06.030373 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e5f3fc33a4028b7075045963bff9f4ca350c1123047d998cc5412b97fcacd8" Feb 02 10:52:06 
crc kubenswrapper[4845]: I0202 10:52:06.030082 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-tt4db-config-n8v8r" Feb 02 10:52:06 crc kubenswrapper[4845]: I0202 10:52:06.649617 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-tt4db-config-n8v8r"] Feb 02 10:52:06 crc kubenswrapper[4845]: I0202 10:52:06.672852 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-tt4db-config-n8v8r"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.033770 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h6qld"] Feb 02 10:52:07 crc kubenswrapper[4845]: E0202 10:52:07.035018 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" containerName="ovn-config" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.035093 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" containerName="ovn-config" Feb 02 10:52:07 crc kubenswrapper[4845]: E0202 10:52:07.035228 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b9529c-8c20-47e9-8c19-910a31b30683" containerName="mariadb-account-create-update" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.035277 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b9529c-8c20-47e9-8c19-910a31b30683" containerName="mariadb-account-create-update" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.050651 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b9529c-8c20-47e9-8c19-910a31b30683" containerName="mariadb-account-create-update" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.050702 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" containerName="ovn-config" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.052081 4845 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.088651 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"c2fbcc4a31322cd87e30f6eef32d4520586ba907ceec6affbbcf35b9ccacc481"} Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.089474 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"5a8e3bf190cdbf1461764cc3a1076f233459488dc63b68121c7bcff0821c1a42"} Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.089625 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"6d8eda2f411a05e52cc68867032f30a625ae5ae33c0a3cc3a1214aedda45e09b"} Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.108682 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f729k\" (UniqueName: \"kubernetes.io/projected/ad1fe923-0409-4c3c-869c-9d0c09a2506a-kube-api-access-f729k\") pod \"cinder-db-create-h6qld\" (UID: \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\") " pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.109239 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1fe923-0409-4c3c-869c-9d0c09a2506a-operator-scripts\") pod \"cinder-db-create-h6qld\" (UID: \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\") " pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.115689 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h6qld"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 
10:52:07.211665 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f729k\" (UniqueName: \"kubernetes.io/projected/ad1fe923-0409-4c3c-869c-9d0c09a2506a-kube-api-access-f729k\") pod \"cinder-db-create-h6qld\" (UID: \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\") " pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.212174 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1fe923-0409-4c3c-869c-9d0c09a2506a-operator-scripts\") pod \"cinder-db-create-h6qld\" (UID: \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\") " pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.213045 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1fe923-0409-4c3c-869c-9d0c09a2506a-operator-scripts\") pod \"cinder-db-create-h6qld\" (UID: \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\") " pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.254532 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f729k\" (UniqueName: \"kubernetes.io/projected/ad1fe923-0409-4c3c-869c-9d0c09a2506a-kube-api-access-f729k\") pod \"cinder-db-create-h6qld\" (UID: \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\") " pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.258870 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-cba3-account-create-update-bph8b"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.264498 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-cba3-account-create-update-bph8b" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.271826 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.302627 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-cba3-account-create-update-bph8b"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.315871 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6csf\" (UniqueName: \"kubernetes.io/projected/9af47917-824a-452b-b0db-03ad3f4861df-kube-api-access-g6csf\") pod \"cinder-cba3-account-create-update-bph8b\" (UID: \"9af47917-824a-452b-b0db-03ad3f4861df\") " pod="openstack/cinder-cba3-account-create-update-bph8b" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.315972 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af47917-824a-452b-b0db-03ad3f4861df-operator-scripts\") pod \"cinder-cba3-account-create-update-bph8b\" (UID: \"9af47917-824a-452b-b0db-03ad3f4861df\") " pod="openstack/cinder-cba3-account-create-update-bph8b" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.328603 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-wnlhd"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.330280 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wnlhd" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.375112 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wnlhd"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.393863 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.422107 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6csf\" (UniqueName: \"kubernetes.io/projected/9af47917-824a-452b-b0db-03ad3f4861df-kube-api-access-g6csf\") pod \"cinder-cba3-account-create-update-bph8b\" (UID: \"9af47917-824a-452b-b0db-03ad3f4861df\") " pod="openstack/cinder-cba3-account-create-update-bph8b" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.422227 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af47917-824a-452b-b0db-03ad3f4861df-operator-scripts\") pod \"cinder-cba3-account-create-update-bph8b\" (UID: \"9af47917-824a-452b-b0db-03ad3f4861df\") " pod="openstack/cinder-cba3-account-create-update-bph8b" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.422268 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s89jm\" (UniqueName: \"kubernetes.io/projected/afec66f7-184b-44f1-a172-b1e78739309d-kube-api-access-s89jm\") pod \"barbican-db-create-wnlhd\" (UID: \"afec66f7-184b-44f1-a172-b1e78739309d\") " pod="openstack/barbican-db-create-wnlhd" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.422299 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afec66f7-184b-44f1-a172-b1e78739309d-operator-scripts\") pod \"barbican-db-create-wnlhd\" (UID: \"afec66f7-184b-44f1-a172-b1e78739309d\") " pod="openstack/barbican-db-create-wnlhd" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.423420 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af47917-824a-452b-b0db-03ad3f4861df-operator-scripts\") pod 
\"cinder-cba3-account-create-update-bph8b\" (UID: \"9af47917-824a-452b-b0db-03ad3f4861df\") " pod="openstack/cinder-cba3-account-create-update-bph8b" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.456503 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jbstq"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.457917 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.460331 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6csf\" (UniqueName: \"kubernetes.io/projected/9af47917-824a-452b-b0db-03ad3f4861df-kube-api-access-g6csf\") pod \"cinder-cba3-account-create-update-bph8b\" (UID: \"9af47917-824a-452b-b0db-03ad3f4861df\") " pod="openstack/cinder-cba3-account-create-update-bph8b" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.466657 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.467120 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.467646 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4r6h5" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.468684 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.478021 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jbstq"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.517476 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-8ggwt"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.518871 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-8ggwt" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.524392 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-combined-ca-bundle\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.524481 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s89jm\" (UniqueName: \"kubernetes.io/projected/afec66f7-184b-44f1-a172-b1e78739309d-kube-api-access-s89jm\") pod \"barbican-db-create-wnlhd\" (UID: \"afec66f7-184b-44f1-a172-b1e78739309d\") " pod="openstack/barbican-db-create-wnlhd" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.524519 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afec66f7-184b-44f1-a172-b1e78739309d-operator-scripts\") pod \"barbican-db-create-wnlhd\" (UID: \"afec66f7-184b-44f1-a172-b1e78739309d\") " pod="openstack/barbican-db-create-wnlhd" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.524544 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-config-data\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.524571 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n52gp\" (UniqueName: \"kubernetes.io/projected/967b449a-1414-4a5c-b625-bcaf12b17ade-kube-api-access-n52gp\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") 
" pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.525437 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afec66f7-184b-44f1-a172-b1e78739309d-operator-scripts\") pod \"barbican-db-create-wnlhd\" (UID: \"afec66f7-184b-44f1-a172-b1e78739309d\") " pod="openstack/barbican-db-create-wnlhd" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.556853 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-8ggwt"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.567629 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s89jm\" (UniqueName: \"kubernetes.io/projected/afec66f7-184b-44f1-a172-b1e78739309d-kube-api-access-s89jm\") pod \"barbican-db-create-wnlhd\" (UID: \"afec66f7-184b-44f1-a172-b1e78739309d\") " pod="openstack/barbican-db-create-wnlhd" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.577488 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-edbb-account-create-update-gt7ll"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.578783 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-edbb-account-create-update-gt7ll" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.581181 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.595870 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-edbb-account-create-update-gt7ll"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.610453 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-cba3-account-create-update-bph8b" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.611879 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ee84-account-create-update-f2n87"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.613163 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ee84-account-create-update-f2n87" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.617201 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.620207 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ee84-account-create-update-f2n87"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.636293 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-operator-scripts\") pod \"heat-edbb-account-create-update-gt7ll\" (UID: \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\") " pod="openstack/heat-edbb-account-create-update-gt7ll" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.636354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-config-data\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.636387 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g49zl\" (UniqueName: \"kubernetes.io/projected/367466e2-34f1-4f2c-9e11-eb6c24c5318c-kube-api-access-g49zl\") pod \"heat-db-create-8ggwt\" (UID: \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\") " pod="openstack/heat-db-create-8ggwt" Feb 
02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.636529 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n52gp\" (UniqueName: \"kubernetes.io/projected/967b449a-1414-4a5c-b625-bcaf12b17ade-kube-api-access-n52gp\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.636670 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thsbw\" (UniqueName: \"kubernetes.io/projected/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-kube-api-access-thsbw\") pod \"heat-edbb-account-create-update-gt7ll\" (UID: \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\") " pod="openstack/heat-edbb-account-create-update-gt7ll" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.636794 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367466e2-34f1-4f2c-9e11-eb6c24c5318c-operator-scripts\") pod \"heat-db-create-8ggwt\" (UID: \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\") " pod="openstack/heat-db-create-8ggwt" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.637092 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-combined-ca-bundle\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.653840 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-combined-ca-bundle\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc 
kubenswrapper[4845]: I0202 10:52:07.654082 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-config-data\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.654786 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wnlhd" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.686737 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n52gp\" (UniqueName: \"kubernetes.io/projected/967b449a-1414-4a5c-b625-bcaf12b17ade-kube-api-access-n52gp\") pod \"keystone-db-sync-jbstq\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.702098 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-phm7s"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.704317 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-phm7s" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.739074 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-operator-scripts\") pod \"heat-edbb-account-create-update-gt7ll\" (UID: \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\") " pod="openstack/heat-edbb-account-create-update-gt7ll" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.739146 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g49zl\" (UniqueName: \"kubernetes.io/projected/367466e2-34f1-4f2c-9e11-eb6c24c5318c-kube-api-access-g49zl\") pod \"heat-db-create-8ggwt\" (UID: \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\") " pod="openstack/heat-db-create-8ggwt" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.739209 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb890e0-ca91-4204-8e4b-9036a64e56e1-operator-scripts\") pod \"barbican-ee84-account-create-update-f2n87\" (UID: \"efb890e0-ca91-4204-8e4b-9036a64e56e1\") " pod="openstack/barbican-ee84-account-create-update-f2n87" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.739234 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thsbw\" (UniqueName: \"kubernetes.io/projected/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-kube-api-access-thsbw\") pod \"heat-edbb-account-create-update-gt7ll\" (UID: \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\") " pod="openstack/heat-edbb-account-create-update-gt7ll" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.739279 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wmdn\" (UniqueName: 
\"kubernetes.io/projected/2cf66acf-0a94-4850-913b-711b19b88dd3-kube-api-access-9wmdn\") pod \"neutron-db-create-phm7s\" (UID: \"2cf66acf-0a94-4850-913b-711b19b88dd3\") " pod="openstack/neutron-db-create-phm7s" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.739301 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367466e2-34f1-4f2c-9e11-eb6c24c5318c-operator-scripts\") pod \"heat-db-create-8ggwt\" (UID: \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\") " pod="openstack/heat-db-create-8ggwt" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.739378 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrbxm\" (UniqueName: \"kubernetes.io/projected/efb890e0-ca91-4204-8e4b-9036a64e56e1-kube-api-access-vrbxm\") pod \"barbican-ee84-account-create-update-f2n87\" (UID: \"efb890e0-ca91-4204-8e4b-9036a64e56e1\") " pod="openstack/barbican-ee84-account-create-update-f2n87" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.739422 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf66acf-0a94-4850-913b-711b19b88dd3-operator-scripts\") pod \"neutron-db-create-phm7s\" (UID: \"2cf66acf-0a94-4850-913b-711b19b88dd3\") " pod="openstack/neutron-db-create-phm7s" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.743228 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-operator-scripts\") pod \"heat-edbb-account-create-update-gt7ll\" (UID: \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\") " pod="openstack/heat-edbb-account-create-update-gt7ll" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.743256 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/367466e2-34f1-4f2c-9e11-eb6c24c5318c-operator-scripts\") pod \"heat-db-create-8ggwt\" (UID: \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\") " pod="openstack/heat-db-create-8ggwt" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.755142 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bb68da2-57ab-4c1b-8e95-e1d434d6cae7" path="/var/lib/kubelet/pods/0bb68da2-57ab-4c1b-8e95-e1d434d6cae7/volumes" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.756222 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-phm7s"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.759394 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thsbw\" (UniqueName: \"kubernetes.io/projected/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-kube-api-access-thsbw\") pod \"heat-edbb-account-create-update-gt7ll\" (UID: \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\") " pod="openstack/heat-edbb-account-create-update-gt7ll" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.764558 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49zl\" (UniqueName: \"kubernetes.io/projected/367466e2-34f1-4f2c-9e11-eb6c24c5318c-kube-api-access-g49zl\") pod \"heat-db-create-8ggwt\" (UID: \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\") " pod="openstack/heat-db-create-8ggwt" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.787081 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.810549 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8ggwt" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.826917 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-edbb-account-create-update-gt7ll" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.841685 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrbxm\" (UniqueName: \"kubernetes.io/projected/efb890e0-ca91-4204-8e4b-9036a64e56e1-kube-api-access-vrbxm\") pod \"barbican-ee84-account-create-update-f2n87\" (UID: \"efb890e0-ca91-4204-8e4b-9036a64e56e1\") " pod="openstack/barbican-ee84-account-create-update-f2n87" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.841763 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf66acf-0a94-4850-913b-711b19b88dd3-operator-scripts\") pod \"neutron-db-create-phm7s\" (UID: \"2cf66acf-0a94-4850-913b-711b19b88dd3\") " pod="openstack/neutron-db-create-phm7s" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.841925 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb890e0-ca91-4204-8e4b-9036a64e56e1-operator-scripts\") pod \"barbican-ee84-account-create-update-f2n87\" (UID: \"efb890e0-ca91-4204-8e4b-9036a64e56e1\") " pod="openstack/barbican-ee84-account-create-update-f2n87" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.841969 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wmdn\" (UniqueName: \"kubernetes.io/projected/2cf66acf-0a94-4850-913b-711b19b88dd3-kube-api-access-9wmdn\") pod \"neutron-db-create-phm7s\" (UID: \"2cf66acf-0a94-4850-913b-711b19b88dd3\") " pod="openstack/neutron-db-create-phm7s" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.843337 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb890e0-ca91-4204-8e4b-9036a64e56e1-operator-scripts\") pod 
\"barbican-ee84-account-create-update-f2n87\" (UID: \"efb890e0-ca91-4204-8e4b-9036a64e56e1\") " pod="openstack/barbican-ee84-account-create-update-f2n87" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.891429 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrbxm\" (UniqueName: \"kubernetes.io/projected/efb890e0-ca91-4204-8e4b-9036a64e56e1-kube-api-access-vrbxm\") pod \"barbican-ee84-account-create-update-f2n87\" (UID: \"efb890e0-ca91-4204-8e4b-9036a64e56e1\") " pod="openstack/barbican-ee84-account-create-update-f2n87" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.913157 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3a8c-account-create-update-qmrlh"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.919936 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3a8c-account-create-update-qmrlh" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.922719 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.923625 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf66acf-0a94-4850-913b-711b19b88dd3-operator-scripts\") pod \"neutron-db-create-phm7s\" (UID: \"2cf66acf-0a94-4850-913b-711b19b88dd3\") " pod="openstack/neutron-db-create-phm7s" Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.941870 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3a8c-account-create-update-qmrlh"] Feb 02 10:52:07 crc kubenswrapper[4845]: I0202 10:52:07.941990 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wmdn\" (UniqueName: \"kubernetes.io/projected/2cf66acf-0a94-4850-913b-711b19b88dd3-kube-api-access-9wmdn\") pod \"neutron-db-create-phm7s\" (UID: \"2cf66acf-0a94-4850-913b-711b19b88dd3\") 
" pod="openstack/neutron-db-create-phm7s" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.045998 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e0fd8e-0f85-48be-b690-c11e3c09f340-operator-scripts\") pod \"neutron-3a8c-account-create-update-qmrlh\" (UID: \"37e0fd8e-0f85-48be-b690-c11e3c09f340\") " pod="openstack/neutron-3a8c-account-create-update-qmrlh" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.046379 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql6tc\" (UniqueName: \"kubernetes.io/projected/37e0fd8e-0f85-48be-b690-c11e3c09f340-kube-api-access-ql6tc\") pod \"neutron-3a8c-account-create-update-qmrlh\" (UID: \"37e0fd8e-0f85-48be-b690-c11e3c09f340\") " pod="openstack/neutron-3a8c-account-create-update-qmrlh" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.110038 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h6qld"] Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.130319 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"5e847664210fc8ad1c87b28e652a97e08bd34261194bd9ae9a7266eb27ea4a77"} Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.148341 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql6tc\" (UniqueName: \"kubernetes.io/projected/37e0fd8e-0f85-48be-b690-c11e3c09f340-kube-api-access-ql6tc\") pod \"neutron-3a8c-account-create-update-qmrlh\" (UID: \"37e0fd8e-0f85-48be-b690-c11e3c09f340\") " pod="openstack/neutron-3a8c-account-create-update-qmrlh" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.148428 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/37e0fd8e-0f85-48be-b690-c11e3c09f340-operator-scripts\") pod \"neutron-3a8c-account-create-update-qmrlh\" (UID: \"37e0fd8e-0f85-48be-b690-c11e3c09f340\") " pod="openstack/neutron-3a8c-account-create-update-qmrlh" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.151709 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ee84-account-create-update-f2n87" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.169949 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"31859db3-3de0-46d0-a81b-b951f1d45279","Type":"ContainerStarted","Data":"fb8b139b6ecef2c8e8393a96d0038f98cb7a4d3638100daa0e3715cdd7f50c17"} Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.171154 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-phm7s" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.229074 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e0fd8e-0f85-48be-b690-c11e3c09f340-operator-scripts\") pod \"neutron-3a8c-account-create-update-qmrlh\" (UID: \"37e0fd8e-0f85-48be-b690-c11e3c09f340\") " pod="openstack/neutron-3a8c-account-create-update-qmrlh" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.233192 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql6tc\" (UniqueName: \"kubernetes.io/projected/37e0fd8e-0f85-48be-b690-c11e3c09f340-kube-api-access-ql6tc\") pod \"neutron-3a8c-account-create-update-qmrlh\" (UID: \"37e0fd8e-0f85-48be-b690-c11e3c09f340\") " pod="openstack/neutron-3a8c-account-create-update-qmrlh" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.255943 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-3a8c-account-create-update-qmrlh" Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.318995 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-wnlhd"] Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.666745 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-cba3-account-create-update-bph8b"] Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.699068 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jbstq"] Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.766359 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-8ggwt"] Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.907443 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-edbb-account-create-update-gt7ll"] Feb 02 10:52:08 crc kubenswrapper[4845]: I0202 10:52:08.922623 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-phm7s"] Feb 02 10:52:08 crc kubenswrapper[4845]: W0202 10:52:08.926216 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45ea84b0_b5f2_4a74_8f6a_67b4176e5d1e.slice/crio-53c8908fbcc0591fbfb69956d0c0c3109bf14ab8287ebd837545ea519acfd6d4 WatchSource:0}: Error finding container 53c8908fbcc0591fbfb69956d0c0c3109bf14ab8287ebd837545ea519acfd6d4: Status 404 returned error can't find the container with id 53c8908fbcc0591fbfb69956d0c0c3109bf14ab8287ebd837545ea519acfd6d4 Feb 02 10:52:09 crc kubenswrapper[4845]: W0202 10:52:09.156625 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb890e0_ca91_4204_8e4b_9036a64e56e1.slice/crio-56770c5d385d67ae06726838ff4816cd3884482b8958ac9f7af99eb30324dfed WatchSource:0}: Error finding container 
56770c5d385d67ae06726838ff4816cd3884482b8958ac9f7af99eb30324dfed: Status 404 returned error can't find the container with id 56770c5d385d67ae06726838ff4816cd3884482b8958ac9f7af99eb30324dfed Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.159282 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ee84-account-create-update-f2n87"] Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.173785 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3a8c-account-create-update-qmrlh"] Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.184591 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3a8c-account-create-update-qmrlh" event={"ID":"37e0fd8e-0f85-48be-b690-c11e3c09f340","Type":"ContainerStarted","Data":"e0b6da0abcbdb37a5cad2eade5c52fcd8029eb38e9f1aa7e99827abac29549bb"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.185930 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ee84-account-create-update-f2n87" event={"ID":"efb890e0-ca91-4204-8e4b-9036a64e56e1","Type":"ContainerStarted","Data":"56770c5d385d67ae06726838ff4816cd3884482b8958ac9f7af99eb30324dfed"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.188525 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-phm7s" event={"ID":"2cf66acf-0a94-4850-913b-711b19b88dd3","Type":"ContainerStarted","Data":"8a50e4e7b2abb555310e50a778920bc7b8f7c931704bcd0016e368041f8db92c"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.191483 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wnlhd" event={"ID":"afec66f7-184b-44f1-a172-b1e78739309d","Type":"ContainerStarted","Data":"7d589ab274ae62a36101e94b25da0bcc5210eda8997714fcc496cd1866ddd622"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.191522 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wnlhd" 
event={"ID":"afec66f7-184b-44f1-a172-b1e78739309d","Type":"ContainerStarted","Data":"021b0148be56b73890ed92473db16080cb897f269c2e554a1690906e496aa7cc"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.196709 4845 generic.go:334] "Generic (PLEG): container finished" podID="ad1fe923-0409-4c3c-869c-9d0c09a2506a" containerID="c4fb16737964c587428fe338ca95d6abb864bb90373d40cc0d8bd05a89c69fe2" exitCode=0 Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.196855 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h6qld" event={"ID":"ad1fe923-0409-4c3c-869c-9d0c09a2506a","Type":"ContainerDied","Data":"c4fb16737964c587428fe338ca95d6abb864bb90373d40cc0d8bd05a89c69fe2"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.196881 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h6qld" event={"ID":"ad1fe923-0409-4c3c-869c-9d0c09a2506a","Type":"ContainerStarted","Data":"e44bcf835547d78301411ca32122b909f9e1e17afa231378a1944362c26b5d4c"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.199682 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cba3-account-create-update-bph8b" event={"ID":"9af47917-824a-452b-b0db-03ad3f4861df","Type":"ContainerStarted","Data":"c026c97f3f623cb46f500e205081203362ba8f0b275d0368c0ea74ac7d34d244"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.199762 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cba3-account-create-update-bph8b" event={"ID":"9af47917-824a-452b-b0db-03ad3f4861df","Type":"ContainerStarted","Data":"00c59b6d5b21a612fe5716f1f92a3adb28414e415b4e9b69ebd7595359e745c4"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.202140 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jbstq" event={"ID":"967b449a-1414-4a5c-b625-bcaf12b17ade","Type":"ContainerStarted","Data":"56c91e8dc88a7d8da4509ded8923f1a4a18a719a8066a712556a9928447a6799"} Feb 02 10:52:09 
crc kubenswrapper[4845]: I0202 10:52:09.217702 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-wnlhd" podStartSLOduration=2.217680681 podStartE2EDuration="2.217680681s" podCreationTimestamp="2026-02-02 10:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:09.213576004 +0000 UTC m=+1210.304977454" watchObservedRunningTime="2026-02-02 10:52:09.217680681 +0000 UTC m=+1210.309082141" Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.220695 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8ggwt" event={"ID":"367466e2-34f1-4f2c-9e11-eb6c24c5318c","Type":"ContainerStarted","Data":"5317c6760f251bc9751fe010b7bce1cb08e32b0f7d5159c83d81107f454b9348"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.237692 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-edbb-account-create-update-gt7ll" event={"ID":"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e","Type":"ContainerStarted","Data":"53c8908fbcc0591fbfb69956d0c0c3109bf14ab8287ebd837545ea519acfd6d4"} Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.278589 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-8ggwt" podStartSLOduration=2.27856972 podStartE2EDuration="2.27856972s" podCreationTimestamp="2026-02-02 10:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:09.267650428 +0000 UTC m=+1210.359051878" watchObservedRunningTime="2026-02-02 10:52:09.27856972 +0000 UTC m=+1210.369971170" Feb 02 10:52:09 crc kubenswrapper[4845]: I0202 10:52:09.281608 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-cba3-account-create-update-bph8b" podStartSLOduration=2.281583236 podStartE2EDuration="2.281583236s" 
podCreationTimestamp="2026-02-02 10:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:09.250655743 +0000 UTC m=+1210.342057193" watchObservedRunningTime="2026-02-02 10:52:09.281583236 +0000 UTC m=+1210.372984686" Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.287596 4845 generic.go:334] "Generic (PLEG): container finished" podID="37e0fd8e-0f85-48be-b690-c11e3c09f340" containerID="a8fcb11c46488e7ab4d44f9b73e21b0ab99aab04b639ff39da8c3dcc0a64fd01" exitCode=0 Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.288270 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3a8c-account-create-update-qmrlh" event={"ID":"37e0fd8e-0f85-48be-b690-c11e3c09f340","Type":"ContainerDied","Data":"a8fcb11c46488e7ab4d44f9b73e21b0ab99aab04b639ff39da8c3dcc0a64fd01"} Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.297433 4845 generic.go:334] "Generic (PLEG): container finished" podID="367466e2-34f1-4f2c-9e11-eb6c24c5318c" containerID="51bb841af27a85ca68abff1f32dcaf10c9ab7f03618a7c256ab9040498ec70ed" exitCode=0 Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.298660 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8ggwt" event={"ID":"367466e2-34f1-4f2c-9e11-eb6c24c5318c","Type":"ContainerDied","Data":"51bb841af27a85ca68abff1f32dcaf10c9ab7f03618a7c256ab9040498ec70ed"} Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.302131 4845 generic.go:334] "Generic (PLEG): container finished" podID="45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e" containerID="b23d76a377763a9782e43474e58cb72f1e55efdf39ac9b7aaca3beca20c268f7" exitCode=0 Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.302193 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-edbb-account-create-update-gt7ll" 
event={"ID":"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e","Type":"ContainerDied","Data":"b23d76a377763a9782e43474e58cb72f1e55efdf39ac9b7aaca3beca20c268f7"} Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.303832 4845 generic.go:334] "Generic (PLEG): container finished" podID="efb890e0-ca91-4204-8e4b-9036a64e56e1" containerID="5c1dc639a0ba7e9ddef0cb628d1e688f84eee0dbcad460df6e423cdfb04749bd" exitCode=0 Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.303910 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ee84-account-create-update-f2n87" event={"ID":"efb890e0-ca91-4204-8e4b-9036a64e56e1","Type":"ContainerDied","Data":"5c1dc639a0ba7e9ddef0cb628d1e688f84eee0dbcad460df6e423cdfb04749bd"} Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.311018 4845 generic.go:334] "Generic (PLEG): container finished" podID="2cf66acf-0a94-4850-913b-711b19b88dd3" containerID="f8e4a4bf00801e300e8c97b01bb80d6e16ede05a4fe2abe38abfdf7564fa62f4" exitCode=0 Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.311233 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-phm7s" event={"ID":"2cf66acf-0a94-4850-913b-711b19b88dd3","Type":"ContainerDied","Data":"f8e4a4bf00801e300e8c97b01bb80d6e16ede05a4fe2abe38abfdf7564fa62f4"} Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.314538 4845 generic.go:334] "Generic (PLEG): container finished" podID="afec66f7-184b-44f1-a172-b1e78739309d" containerID="7d589ab274ae62a36101e94b25da0bcc5210eda8997714fcc496cd1866ddd622" exitCode=0 Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.314605 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wnlhd" event={"ID":"afec66f7-184b-44f1-a172-b1e78739309d","Type":"ContainerDied","Data":"7d589ab274ae62a36101e94b25da0bcc5210eda8997714fcc496cd1866ddd622"} Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.334763 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="9af47917-824a-452b-b0db-03ad3f4861df" containerID="c026c97f3f623cb46f500e205081203362ba8f0b275d0368c0ea74ac7d34d244" exitCode=0 Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.334856 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cba3-account-create-update-bph8b" event={"ID":"9af47917-824a-452b-b0db-03ad3f4861df","Type":"ContainerDied","Data":"c026c97f3f623cb46f500e205081203362ba8f0b275d0368c0ea74ac7d34d244"} Feb 02 10:52:10 crc kubenswrapper[4845]: I0202 10:52:10.996767 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.080729 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f729k\" (UniqueName: \"kubernetes.io/projected/ad1fe923-0409-4c3c-869c-9d0c09a2506a-kube-api-access-f729k\") pod \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\" (UID: \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\") " Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.081129 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1fe923-0409-4c3c-869c-9d0c09a2506a-operator-scripts\") pod \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\" (UID: \"ad1fe923-0409-4c3c-869c-9d0c09a2506a\") " Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.082174 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad1fe923-0409-4c3c-869c-9d0c09a2506a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad1fe923-0409-4c3c-869c-9d0c09a2506a" (UID: "ad1fe923-0409-4c3c-869c-9d0c09a2506a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.088143 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1fe923-0409-4c3c-869c-9d0c09a2506a-kube-api-access-f729k" (OuterVolumeSpecName: "kube-api-access-f729k") pod "ad1fe923-0409-4c3c-869c-9d0c09a2506a" (UID: "ad1fe923-0409-4c3c-869c-9d0c09a2506a"). InnerVolumeSpecName "kube-api-access-f729k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.183490 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1fe923-0409-4c3c-869c-9d0c09a2506a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.183528 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f729k\" (UniqueName: \"kubernetes.io/projected/ad1fe923-0409-4c3c-869c-9d0c09a2506a-kube-api-access-f729k\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.354986 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"74923277cb3713500b97089277d8f20c5a6f124a4bd6af7f533053a052e8bb3a"} Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.355256 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"405c922bf87f1bef0ae98c722bd475e41e4a19d9656c4a5e8d7d1ca39f678583"} Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.355267 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"24917214958f74c9af404d02d87263286e2f56621b9c258d51a89d82c92b4fc4"} Feb 02 10:52:11 crc 
kubenswrapper[4845]: I0202 10:52:11.357708 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h6qld" event={"ID":"ad1fe923-0409-4c3c-869c-9d0c09a2506a","Type":"ContainerDied","Data":"e44bcf835547d78301411ca32122b909f9e1e17afa231378a1944362c26b5d4c"} Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.357747 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44bcf835547d78301411ca32122b909f9e1e17afa231378a1944362c26b5d4c" Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.357764 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h6qld" Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.748929 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8ggwt" Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.921746 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367466e2-34f1-4f2c-9e11-eb6c24c5318c-operator-scripts\") pod \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\" (UID: \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\") " Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.921960 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g49zl\" (UniqueName: \"kubernetes.io/projected/367466e2-34f1-4f2c-9e11-eb6c24c5318c-kube-api-access-g49zl\") pod \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\" (UID: \"367466e2-34f1-4f2c-9e11-eb6c24c5318c\") " Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.922217 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/367466e2-34f1-4f2c-9e11-eb6c24c5318c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "367466e2-34f1-4f2c-9e11-eb6c24c5318c" (UID: "367466e2-34f1-4f2c-9e11-eb6c24c5318c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.923028 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/367466e2-34f1-4f2c-9e11-eb6c24c5318c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:11 crc kubenswrapper[4845]: I0202 10:52:11.934235 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367466e2-34f1-4f2c-9e11-eb6c24c5318c-kube-api-access-g49zl" (OuterVolumeSpecName: "kube-api-access-g49zl") pod "367466e2-34f1-4f2c-9e11-eb6c24c5318c" (UID: "367466e2-34f1-4f2c-9e11-eb6c24c5318c"). InnerVolumeSpecName "kube-api-access-g49zl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.018474 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ee84-account-create-update-f2n87"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.025265 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g49zl\" (UniqueName: \"kubernetes.io/projected/367466e2-34f1-4f2c-9e11-eb6c24c5318c-kube-api-access-g49zl\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.027749 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wnlhd"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.126319 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afec66f7-184b-44f1-a172-b1e78739309d-operator-scripts\") pod \"afec66f7-184b-44f1-a172-b1e78739309d\" (UID: \"afec66f7-184b-44f1-a172-b1e78739309d\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.126400 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrbxm\" (UniqueName: \"kubernetes.io/projected/efb890e0-ca91-4204-8e4b-9036a64e56e1-kube-api-access-vrbxm\") pod \"efb890e0-ca91-4204-8e4b-9036a64e56e1\" (UID: \"efb890e0-ca91-4204-8e4b-9036a64e56e1\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.126434 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s89jm\" (UniqueName: \"kubernetes.io/projected/afec66f7-184b-44f1-a172-b1e78739309d-kube-api-access-s89jm\") pod \"afec66f7-184b-44f1-a172-b1e78739309d\" (UID: \"afec66f7-184b-44f1-a172-b1e78739309d\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.126670 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb890e0-ca91-4204-8e4b-9036a64e56e1-operator-scripts\") pod \"efb890e0-ca91-4204-8e4b-9036a64e56e1\" (UID: \"efb890e0-ca91-4204-8e4b-9036a64e56e1\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.127621 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efb890e0-ca91-4204-8e4b-9036a64e56e1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efb890e0-ca91-4204-8e4b-9036a64e56e1" (UID: "efb890e0-ca91-4204-8e4b-9036a64e56e1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.128495 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afec66f7-184b-44f1-a172-b1e78739309d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afec66f7-184b-44f1-a172-b1e78739309d" (UID: "afec66f7-184b-44f1-a172-b1e78739309d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.133252 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afec66f7-184b-44f1-a172-b1e78739309d-kube-api-access-s89jm" (OuterVolumeSpecName: "kube-api-access-s89jm") pod "afec66f7-184b-44f1-a172-b1e78739309d" (UID: "afec66f7-184b-44f1-a172-b1e78739309d"). InnerVolumeSpecName "kube-api-access-s89jm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.134261 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb890e0-ca91-4204-8e4b-9036a64e56e1-kube-api-access-vrbxm" (OuterVolumeSpecName: "kube-api-access-vrbxm") pod "efb890e0-ca91-4204-8e4b-9036a64e56e1" (UID: "efb890e0-ca91-4204-8e4b-9036a64e56e1"). InnerVolumeSpecName "kube-api-access-vrbxm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.230259 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afec66f7-184b-44f1-a172-b1e78739309d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.230311 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrbxm\" (UniqueName: \"kubernetes.io/projected/efb890e0-ca91-4204-8e4b-9036a64e56e1-kube-api-access-vrbxm\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.230328 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s89jm\" (UniqueName: \"kubernetes.io/projected/afec66f7-184b-44f1-a172-b1e78739309d-kube-api-access-s89jm\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.230344 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efb890e0-ca91-4204-8e4b-9036a64e56e1-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.339087 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-edbb-account-create-update-gt7ll"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.363085 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3a8c-account-create-update-qmrlh"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.369861 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-phm7s"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.390984 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ee84-account-create-update-f2n87" event={"ID":"efb890e0-ca91-4204-8e4b-9036a64e56e1","Type":"ContainerDied","Data":"56770c5d385d67ae06726838ff4816cd3884482b8958ac9f7af99eb30324dfed"}
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.391033 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56770c5d385d67ae06726838ff4816cd3884482b8958ac9f7af99eb30324dfed"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.391110 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ee84-account-create-update-f2n87"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.403270 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-phm7s" event={"ID":"2cf66acf-0a94-4850-913b-711b19b88dd3","Type":"ContainerDied","Data":"8a50e4e7b2abb555310e50a778920bc7b8f7c931704bcd0016e368041f8db92c"}
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.403319 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a50e4e7b2abb555310e50a778920bc7b8f7c931704bcd0016e368041f8db92c"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.403393 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-phm7s"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.410302 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-wnlhd" event={"ID":"afec66f7-184b-44f1-a172-b1e78739309d","Type":"ContainerDied","Data":"021b0148be56b73890ed92473db16080cb897f269c2e554a1690906e496aa7cc"}
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.410359 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="021b0148be56b73890ed92473db16080cb897f269c2e554a1690906e496aa7cc"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.410438 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-wnlhd"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.420037 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"d1a41540df997fa052831691a5536c71283d086d443f0639d3997a30859d370e"}
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.422090 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3a8c-account-create-update-qmrlh" event={"ID":"37e0fd8e-0f85-48be-b690-c11e3c09f340","Type":"ContainerDied","Data":"e0b6da0abcbdb37a5cad2eade5c52fcd8029eb38e9f1aa7e99827abac29549bb"}
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.422143 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0b6da0abcbdb37a5cad2eade5c52fcd8029eb38e9f1aa7e99827abac29549bb"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.422107 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3a8c-account-create-update-qmrlh"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.423310 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-8ggwt" event={"ID":"367466e2-34f1-4f2c-9e11-eb6c24c5318c","Type":"ContainerDied","Data":"5317c6760f251bc9751fe010b7bce1cb08e32b0f7d5159c83d81107f454b9348"}
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.423341 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5317c6760f251bc9751fe010b7bce1cb08e32b0f7d5159c83d81107f454b9348"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.423368 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-8ggwt"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.427635 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-edbb-account-create-update-gt7ll" event={"ID":"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e","Type":"ContainerDied","Data":"53c8908fbcc0591fbfb69956d0c0c3109bf14ab8287ebd837545ea519acfd6d4"}
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.427679 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53c8908fbcc0591fbfb69956d0c0c3109bf14ab8287ebd837545ea519acfd6d4"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.427732 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-edbb-account-create-update-gt7ll"
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.433978 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-operator-scripts\") pod \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\" (UID: \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.434232 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thsbw\" (UniqueName: \"kubernetes.io/projected/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-kube-api-access-thsbw\") pod \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\" (UID: \"45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.434684 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e" (UID: "45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.435051 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.439566 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-kube-api-access-thsbw" (OuterVolumeSpecName: "kube-api-access-thsbw") pod "45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e" (UID: "45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e"). InnerVolumeSpecName "kube-api-access-thsbw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.536497 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wmdn\" (UniqueName: \"kubernetes.io/projected/2cf66acf-0a94-4850-913b-711b19b88dd3-kube-api-access-9wmdn\") pod \"2cf66acf-0a94-4850-913b-711b19b88dd3\" (UID: \"2cf66acf-0a94-4850-913b-711b19b88dd3\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.536767 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf66acf-0a94-4850-913b-711b19b88dd3-operator-scripts\") pod \"2cf66acf-0a94-4850-913b-711b19b88dd3\" (UID: \"2cf66acf-0a94-4850-913b-711b19b88dd3\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.536836 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e0fd8e-0f85-48be-b690-c11e3c09f340-operator-scripts\") pod \"37e0fd8e-0f85-48be-b690-c11e3c09f340\" (UID: \"37e0fd8e-0f85-48be-b690-c11e3c09f340\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.537035 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql6tc\" (UniqueName: \"kubernetes.io/projected/37e0fd8e-0f85-48be-b690-c11e3c09f340-kube-api-access-ql6tc\") pod \"37e0fd8e-0f85-48be-b690-c11e3c09f340\" (UID: \"37e0fd8e-0f85-48be-b690-c11e3c09f340\") "
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.538108 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cf66acf-0a94-4850-913b-711b19b88dd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2cf66acf-0a94-4850-913b-711b19b88dd3" (UID: "2cf66acf-0a94-4850-913b-711b19b88dd3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.538133 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37e0fd8e-0f85-48be-b690-c11e3c09f340-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37e0fd8e-0f85-48be-b690-c11e3c09f340" (UID: "37e0fd8e-0f85-48be-b690-c11e3c09f340"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.538867 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37e0fd8e-0f85-48be-b690-c11e3c09f340-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.538916 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thsbw\" (UniqueName: \"kubernetes.io/projected/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e-kube-api-access-thsbw\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.538932 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2cf66acf-0a94-4850-913b-711b19b88dd3-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.542077 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf66acf-0a94-4850-913b-711b19b88dd3-kube-api-access-9wmdn" (OuterVolumeSpecName: "kube-api-access-9wmdn") pod "2cf66acf-0a94-4850-913b-711b19b88dd3" (UID: "2cf66acf-0a94-4850-913b-711b19b88dd3"). InnerVolumeSpecName "kube-api-access-9wmdn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.542132 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e0fd8e-0f85-48be-b690-c11e3c09f340-kube-api-access-ql6tc" (OuterVolumeSpecName: "kube-api-access-ql6tc") pod "37e0fd8e-0f85-48be-b690-c11e3c09f340" (UID: "37e0fd8e-0f85-48be-b690-c11e3c09f340"). InnerVolumeSpecName "kube-api-access-ql6tc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.641154 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql6tc\" (UniqueName: \"kubernetes.io/projected/37e0fd8e-0f85-48be-b690-c11e3c09f340-kube-api-access-ql6tc\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:12 crc kubenswrapper[4845]: I0202 10:52:12.641196 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wmdn\" (UniqueName: \"kubernetes.io/projected/2cf66acf-0a94-4850-913b-711b19b88dd3-kube-api-access-9wmdn\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:13 crc kubenswrapper[4845]: I0202 10:52:13.437736 4845 generic.go:334] "Generic (PLEG): container finished" podID="31859db3-3de0-46d0-a81b-b951f1d45279" containerID="fb8b139b6ecef2c8e8393a96d0038f98cb7a4d3638100daa0e3715cdd7f50c17" exitCode=0
Feb 02 10:52:13 crc kubenswrapper[4845]: I0202 10:52:13.438186 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"31859db3-3de0-46d0-a81b-b951f1d45279","Type":"ContainerDied","Data":"fb8b139b6ecef2c8e8393a96d0038f98cb7a4d3638100daa0e3715cdd7f50c17"}
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.473010 4845 generic.go:334] "Generic (PLEG): container finished" podID="34877df4-b654-4e0c-ac67-da6fd95c249d" containerID="3a7ec3be2e02a83d1849c451b822789235edc5a4672079139f59894d6d036a70" exitCode=0
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.473126 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kgn95" event={"ID":"34877df4-b654-4e0c-ac67-da6fd95c249d","Type":"ContainerDied","Data":"3a7ec3be2e02a83d1849c451b822789235edc5a4672079139f59894d6d036a70"}
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.478384 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-cba3-account-create-update-bph8b" event={"ID":"9af47917-824a-452b-b0db-03ad3f4861df","Type":"ContainerDied","Data":"00c59b6d5b21a612fe5716f1f92a3adb28414e415b4e9b69ebd7595359e745c4"}
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.478413 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00c59b6d5b21a612fe5716f1f92a3adb28414e415b4e9b69ebd7595359e745c4"
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.527671 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cba3-account-create-update-bph8b"
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.631605 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af47917-824a-452b-b0db-03ad3f4861df-operator-scripts\") pod \"9af47917-824a-452b-b0db-03ad3f4861df\" (UID: \"9af47917-824a-452b-b0db-03ad3f4861df\") "
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.631903 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6csf\" (UniqueName: \"kubernetes.io/projected/9af47917-824a-452b-b0db-03ad3f4861df-kube-api-access-g6csf\") pod \"9af47917-824a-452b-b0db-03ad3f4861df\" (UID: \"9af47917-824a-452b-b0db-03ad3f4861df\") "
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.632321 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9af47917-824a-452b-b0db-03ad3f4861df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9af47917-824a-452b-b0db-03ad3f4861df" (UID: "9af47917-824a-452b-b0db-03ad3f4861df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.632551 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9af47917-824a-452b-b0db-03ad3f4861df-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.642226 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af47917-824a-452b-b0db-03ad3f4861df-kube-api-access-g6csf" (OuterVolumeSpecName: "kube-api-access-g6csf") pod "9af47917-824a-452b-b0db-03ad3f4861df" (UID: "9af47917-824a-452b-b0db-03ad3f4861df"). InnerVolumeSpecName "kube-api-access-g6csf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:16 crc kubenswrapper[4845]: I0202 10:52:16.734035 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6csf\" (UniqueName: \"kubernetes.io/projected/9af47917-824a-452b-b0db-03ad3f4861df-kube-api-access-g6csf\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:17 crc kubenswrapper[4845]: I0202 10:52:17.489824 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jbstq" event={"ID":"967b449a-1414-4a5c-b625-bcaf12b17ade","Type":"ContainerStarted","Data":"493cf0ea40e1ecb4c2bc2c0fc9bcd32cc6e220ecfec73e51aec90faf9abebac3"}
Feb 02 10:52:17 crc kubenswrapper[4845]: I0202 10:52:17.493382 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"31859db3-3de0-46d0-a81b-b951f1d45279","Type":"ContainerStarted","Data":"bd655aec5adcf4557d3cd7bb2d2b2176da4d03f540d31774ecb69beed6ccf9fd"}
Feb 02 10:52:17 crc kubenswrapper[4845]: I0202 10:52:17.509807 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jbstq" podStartSLOduration=2.293468026 podStartE2EDuration="10.509792149s" podCreationTimestamp="2026-02-02 10:52:07 +0000 UTC" firstStartedPulling="2026-02-02 10:52:08.641943174 +0000 UTC m=+1209.733344624" lastFinishedPulling="2026-02-02 10:52:16.858267297 +0000 UTC m=+1217.949668747" observedRunningTime="2026-02-02 10:52:17.506761193 +0000 UTC m=+1218.598162633" watchObservedRunningTime="2026-02-02 10:52:17.509792149 +0000 UTC m=+1218.601193599"
Feb 02 10:52:17 crc kubenswrapper[4845]: I0202 10:52:17.511105 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-cba3-account-create-update-bph8b"
Feb 02 10:52:17 crc kubenswrapper[4845]: I0202 10:52:17.517013 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"ab146fc1a35a7273f26471788eb3651c45f72d8d7bced303478fe4e7b486106b"}
Feb 02 10:52:17 crc kubenswrapper[4845]: I0202 10:52:17.517069 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"81be3a13f800d5dc0821464060f981d73bf8707d121acef9872d7cbd7e764eac"}
Feb 02 10:52:17 crc kubenswrapper[4845]: I0202 10:52:17.517083 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"f18e574b0772d18c72da2365bbb00a93f5efd2703c044c947bd63bedf35aaa7e"}
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.173557 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kgn95"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.270274 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-combined-ca-bundle\") pod \"34877df4-b654-4e0c-ac67-da6fd95c249d\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") "
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.270340 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkc25\" (UniqueName: \"kubernetes.io/projected/34877df4-b654-4e0c-ac67-da6fd95c249d-kube-api-access-rkc25\") pod \"34877df4-b654-4e0c-ac67-da6fd95c249d\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") "
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.270431 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-config-data\") pod \"34877df4-b654-4e0c-ac67-da6fd95c249d\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") "
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.270583 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-db-sync-config-data\") pod \"34877df4-b654-4e0c-ac67-da6fd95c249d\" (UID: \"34877df4-b654-4e0c-ac67-da6fd95c249d\") "
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.274568 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "34877df4-b654-4e0c-ac67-da6fd95c249d" (UID: "34877df4-b654-4e0c-ac67-da6fd95c249d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.276918 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34877df4-b654-4e0c-ac67-da6fd95c249d-kube-api-access-rkc25" (OuterVolumeSpecName: "kube-api-access-rkc25") pod "34877df4-b654-4e0c-ac67-da6fd95c249d" (UID: "34877df4-b654-4e0c-ac67-da6fd95c249d"). InnerVolumeSpecName "kube-api-access-rkc25". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.321078 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34877df4-b654-4e0c-ac67-da6fd95c249d" (UID: "34877df4-b654-4e0c-ac67-da6fd95c249d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.332272 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-config-data" (OuterVolumeSpecName: "config-data") pod "34877df4-b654-4e0c-ac67-da6fd95c249d" (UID: "34877df4-b654-4e0c-ac67-da6fd95c249d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.374368 4845 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.374498 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.374510 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkc25\" (UniqueName: \"kubernetes.io/projected/34877df4-b654-4e0c-ac67-da6fd95c249d-kube-api-access-rkc25\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.374521 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34877df4-b654-4e0c-ac67-da6fd95c249d-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.523431 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-kgn95" event={"ID":"34877df4-b654-4e0c-ac67-da6fd95c249d","Type":"ContainerDied","Data":"46b3e4f9d3e3e603b74f4066b84d76558af5b054c23ef7c7b9de906dee295d9c"}
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.523507 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b3e4f9d3e3e603b74f4066b84d76558af5b054c23ef7c7b9de906dee295d9c"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.523468 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-kgn95"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.530069 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"70063302b184916c235c5fd4a1532435663c441d53879f700c408590babab027"}
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.530403 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"bc9250ab28e8c7cbc5470cf90b33e00f598a4fe8e78e976f916ac1aaa79460c6"}
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.903957 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-g827z"]
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.909792 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb890e0-ca91-4204-8e4b-9036a64e56e1" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.909827 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb890e0-ca91-4204-8e4b-9036a64e56e1" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.909841 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af47917-824a-452b-b0db-03ad3f4861df" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.909850 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af47917-824a-452b-b0db-03ad3f4861df" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.909857 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf66acf-0a94-4850-913b-711b19b88dd3" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.909864 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf66acf-0a94-4850-913b-711b19b88dd3" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.909907 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34877df4-b654-4e0c-ac67-da6fd95c249d" containerName="glance-db-sync"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.909914 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="34877df4-b654-4e0c-ac67-da6fd95c249d" containerName="glance-db-sync"
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.909924 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e0fd8e-0f85-48be-b690-c11e3c09f340" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.909930 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e0fd8e-0f85-48be-b690-c11e3c09f340" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.909943 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1fe923-0409-4c3c-869c-9d0c09a2506a" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.909949 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1fe923-0409-4c3c-869c-9d0c09a2506a" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.909964 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367466e2-34f1-4f2c-9e11-eb6c24c5318c" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.909969 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="367466e2-34f1-4f2c-9e11-eb6c24c5318c" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.909988 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afec66f7-184b-44f1-a172-b1e78739309d" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.909996 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="afec66f7-184b-44f1-a172-b1e78739309d" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: E0202 10:52:18.910007 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910013 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910315 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af47917-824a-452b-b0db-03ad3f4861df" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910331 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb890e0-ca91-4204-8e4b-9036a64e56e1" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910349 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="34877df4-b654-4e0c-ac67-da6fd95c249d" containerName="glance-db-sync"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910359 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1fe923-0409-4c3c-869c-9d0c09a2506a" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910373 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910380 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf66acf-0a94-4850-913b-711b19b88dd3" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910390 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="367466e2-34f1-4f2c-9e11-eb6c24c5318c" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910398 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e0fd8e-0f85-48be-b690-c11e3c09f340" containerName="mariadb-account-create-update"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.910409 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="afec66f7-184b-44f1-a172-b1e78739309d" containerName="mariadb-database-create"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.911487 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.931252 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-g827z"]
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.997106 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-config\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.997182 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.997219 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwhwv\" (UniqueName: \"kubernetes.io/projected/e0a38136-159d-482a-988e-07f3b77fdbb4-kube-api-access-vwhwv\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.997255 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:18 crc kubenswrapper[4845]: I0202 10:52:18.997452 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.099828 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwhwv\" (UniqueName: \"kubernetes.io/projected/e0a38136-159d-482a-988e-07f3b77fdbb4-kube-api-access-vwhwv\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.099933 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.100050 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.100137 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-config\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.100212 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.101280 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.103206 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-config\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.103845 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z"
Feb 02 10:52:19 crc
kubenswrapper[4845]: I0202 10:52:19.104656 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z" Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.132023 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwhwv\" (UniqueName: \"kubernetes.io/projected/e0a38136-159d-482a-988e-07f3b77fdbb4-kube-api-access-vwhwv\") pod \"dnsmasq-dns-5b946c75cc-g827z\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " pod="openstack/dnsmasq-dns-5b946c75cc-g827z" Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.268911 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-g827z" Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.567671 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"6e450f575a10ed5eb53c16a2f8cfb924b06dfa9ed4395c0b603f21cf26457698"} Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.567940 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d6db6e42-984a-484b-9f90-e6efa9817f37","Type":"ContainerStarted","Data":"02c633fbb7c37eada8bc13c59416fc56c82d489fed906a675bc9ca4e06f4e1dc"} Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.653812 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.27389171 podStartE2EDuration="50.653787475s" podCreationTimestamp="2026-02-02 10:51:29 +0000 UTC" firstStartedPulling="2026-02-02 10:52:03.477064574 +0000 UTC m=+1204.568466024" lastFinishedPulling="2026-02-02 10:52:16.856960339 +0000 UTC m=+1217.948361789" 
observedRunningTime="2026-02-02 10:52:19.64206258 +0000 UTC m=+1220.733464020" watchObservedRunningTime="2026-02-02 10:52:19.653787475 +0000 UTC m=+1220.745188925" Feb 02 10:52:19 crc kubenswrapper[4845]: W0202 10:52:19.821158 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0a38136_159d_482a_988e_07f3b77fdbb4.slice/crio-a82710ddcd829b8f907c2ebc80d76437306ca2f6502c542d653b11393fa4153c WatchSource:0}: Error finding container a82710ddcd829b8f907c2ebc80d76437306ca2f6502c542d653b11393fa4153c: Status 404 returned error can't find the container with id a82710ddcd829b8f907c2ebc80d76437306ca2f6502c542d653b11393fa4153c Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.821404 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-g827z"] Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.916612 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-g827z"] Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.958486 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g5r95"] Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.961459 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.967328 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 02 10:52:19 crc kubenswrapper[4845]: I0202 10:52:19.992034 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g5r95"] Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.031869 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.032002 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.032038 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.032110 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-config\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.032133 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdm58\" (UniqueName: \"kubernetes.io/projected/cc6155c0-72ff-4f97-9748-716e3df8ad88-kube-api-access-hdm58\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.032292 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.137254 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.137330 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.137360 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.137412 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-config\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.137439 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdm58\" (UniqueName: \"kubernetes.io/projected/cc6155c0-72ff-4f97-9748-716e3df8ad88-kube-api-access-hdm58\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.137542 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.138366 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.138411 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: 
I0202 10:52:20.139084 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.139143 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-config\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.139774 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.196003 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdm58\" (UniqueName: \"kubernetes.io/projected/cc6155c0-72ff-4f97-9748-716e3df8ad88-kube-api-access-hdm58\") pod \"dnsmasq-dns-74f6bcbc87-g5r95\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.218921 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.582578 4845 generic.go:334] "Generic (PLEG): container finished" podID="e0a38136-159d-482a-988e-07f3b77fdbb4" containerID="f21c9d9931f7cb69bc5973a19397e3c42576eb648cfa1130b5861a6b72c96acd" exitCode=0 Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.582770 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-g827z" event={"ID":"e0a38136-159d-482a-988e-07f3b77fdbb4","Type":"ContainerDied","Data":"f21c9d9931f7cb69bc5973a19397e3c42576eb648cfa1130b5861a6b72c96acd"} Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.582970 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-g827z" event={"ID":"e0a38136-159d-482a-988e-07f3b77fdbb4","Type":"ContainerStarted","Data":"a82710ddcd829b8f907c2ebc80d76437306ca2f6502c542d653b11393fa4153c"} Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.593321 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"31859db3-3de0-46d0-a81b-b951f1d45279","Type":"ContainerStarted","Data":"becbf6e82dc65913f8e07cd6976d63486bd12ef17a78f01ee8609cf3cf55427b"} Feb 02 10:52:20 crc kubenswrapper[4845]: I0202 10:52:20.872284 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g5r95"] Feb 02 10:52:20 crc kubenswrapper[4845]: W0202 10:52:20.880041 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc6155c0_72ff_4f97_9748_716e3df8ad88.slice/crio-4cc1cd6ad5cc574e57eca193db9f213196ac2057c994f0da0d19824fee8e97c1 WatchSource:0}: Error finding container 4cc1cd6ad5cc574e57eca193db9f213196ac2057c994f0da0d19824fee8e97c1: Status 404 returned error can't find the container with id 4cc1cd6ad5cc574e57eca193db9f213196ac2057c994f0da0d19824fee8e97c1 Feb 02 10:52:21 crc 
kubenswrapper[4845]: I0202 10:52:21.071983 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-g827z" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.165386 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-sb\") pod \"e0a38136-159d-482a-988e-07f3b77fdbb4\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.165478 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-config\") pod \"e0a38136-159d-482a-988e-07f3b77fdbb4\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.165551 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-nb\") pod \"e0a38136-159d-482a-988e-07f3b77fdbb4\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.165672 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-dns-svc\") pod \"e0a38136-159d-482a-988e-07f3b77fdbb4\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.165712 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwhwv\" (UniqueName: \"kubernetes.io/projected/e0a38136-159d-482a-988e-07f3b77fdbb4-kube-api-access-vwhwv\") pod \"e0a38136-159d-482a-988e-07f3b77fdbb4\" (UID: \"e0a38136-159d-482a-988e-07f3b77fdbb4\") " Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.171073 4845 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a38136-159d-482a-988e-07f3b77fdbb4-kube-api-access-vwhwv" (OuterVolumeSpecName: "kube-api-access-vwhwv") pod "e0a38136-159d-482a-988e-07f3b77fdbb4" (UID: "e0a38136-159d-482a-988e-07f3b77fdbb4"). InnerVolumeSpecName "kube-api-access-vwhwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.194562 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e0a38136-159d-482a-988e-07f3b77fdbb4" (UID: "e0a38136-159d-482a-988e-07f3b77fdbb4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.195201 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e0a38136-159d-482a-988e-07f3b77fdbb4" (UID: "e0a38136-159d-482a-988e-07f3b77fdbb4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.196637 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-config" (OuterVolumeSpecName: "config") pod "e0a38136-159d-482a-988e-07f3b77fdbb4" (UID: "e0a38136-159d-482a-988e-07f3b77fdbb4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.198198 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e0a38136-159d-482a-988e-07f3b77fdbb4" (UID: "e0a38136-159d-482a-988e-07f3b77fdbb4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.269192 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.269244 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwhwv\" (UniqueName: \"kubernetes.io/projected/e0a38136-159d-482a-988e-07f3b77fdbb4-kube-api-access-vwhwv\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.269259 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.269273 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.269284 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e0a38136-159d-482a-988e-07f3b77fdbb4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.603127 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-g827z" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.603165 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-g827z" event={"ID":"e0a38136-159d-482a-988e-07f3b77fdbb4","Type":"ContainerDied","Data":"a82710ddcd829b8f907c2ebc80d76437306ca2f6502c542d653b11393fa4153c"} Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.604160 4845 scope.go:117] "RemoveContainer" containerID="f21c9d9931f7cb69bc5973a19397e3c42576eb648cfa1130b5861a6b72c96acd" Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.611700 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"31859db3-3de0-46d0-a81b-b951f1d45279","Type":"ContainerStarted","Data":"b0b33f8b69d3389ea6162be514e962a2eb636734d57bb003836896c3d935c8e1"} Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.615798 4845 generic.go:334] "Generic (PLEG): container finished" podID="cc6155c0-72ff-4f97-9748-716e3df8ad88" containerID="3f008994368901472bc0803e0e3411d898bba47e6f98a3c829832d620028583b" exitCode=0 Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.615858 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" event={"ID":"cc6155c0-72ff-4f97-9748-716e3df8ad88","Type":"ContainerDied","Data":"3f008994368901472bc0803e0e3411d898bba47e6f98a3c829832d620028583b"} Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.615905 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" event={"ID":"cc6155c0-72ff-4f97-9748-716e3df8ad88","Type":"ContainerStarted","Data":"4cc1cd6ad5cc574e57eca193db9f213196ac2057c994f0da0d19824fee8e97c1"} Feb 02 10:52:21 crc kubenswrapper[4845]: I0202 10:52:21.661905 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.66186759 
podStartE2EDuration="19.66186759s" podCreationTimestamp="2026-02-02 10:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:21.652506863 +0000 UTC m=+1222.743908313" watchObservedRunningTime="2026-02-02 10:52:21.66186759 +0000 UTC m=+1222.753269040" Feb 02 10:52:22 crc kubenswrapper[4845]: I0202 10:52:22.627910 4845 generic.go:334] "Generic (PLEG): container finished" podID="967b449a-1414-4a5c-b625-bcaf12b17ade" containerID="493cf0ea40e1ecb4c2bc2c0fc9bcd32cc6e220ecfec73e51aec90faf9abebac3" exitCode=0 Feb 02 10:52:22 crc kubenswrapper[4845]: I0202 10:52:22.627993 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jbstq" event={"ID":"967b449a-1414-4a5c-b625-bcaf12b17ade","Type":"ContainerDied","Data":"493cf0ea40e1ecb4c2bc2c0fc9bcd32cc6e220ecfec73e51aec90faf9abebac3"} Feb 02 10:52:22 crc kubenswrapper[4845]: I0202 10:52:22.633426 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" event={"ID":"cc6155c0-72ff-4f97-9748-716e3df8ad88","Type":"ContainerStarted","Data":"2d2c3127f10566f1a93e004f0e822b849cbd5d19678d12b2cc9041ee23459162"} Feb 02 10:52:22 crc kubenswrapper[4845]: I0202 10:52:22.679249 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" podStartSLOduration=3.679229758 podStartE2EDuration="3.679229758s" podCreationTimestamp="2026-02-02 10:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:22.674864614 +0000 UTC m=+1223.766266054" watchObservedRunningTime="2026-02-02 10:52:22.679229758 +0000 UTC m=+1223.770631208" Feb 02 10:52:23 crc kubenswrapper[4845]: I0202 10:52:23.057471 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:23 crc 
kubenswrapper[4845]: I0202 10:52:23.644378 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.119316 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.247773 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n52gp\" (UniqueName: \"kubernetes.io/projected/967b449a-1414-4a5c-b625-bcaf12b17ade-kube-api-access-n52gp\") pod \"967b449a-1414-4a5c-b625-bcaf12b17ade\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.248135 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-combined-ca-bundle\") pod \"967b449a-1414-4a5c-b625-bcaf12b17ade\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.248170 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-config-data\") pod \"967b449a-1414-4a5c-b625-bcaf12b17ade\" (UID: \"967b449a-1414-4a5c-b625-bcaf12b17ade\") " Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.257228 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/967b449a-1414-4a5c-b625-bcaf12b17ade-kube-api-access-n52gp" (OuterVolumeSpecName: "kube-api-access-n52gp") pod "967b449a-1414-4a5c-b625-bcaf12b17ade" (UID: "967b449a-1414-4a5c-b625-bcaf12b17ade"). InnerVolumeSpecName "kube-api-access-n52gp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.305230 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "967b449a-1414-4a5c-b625-bcaf12b17ade" (UID: "967b449a-1414-4a5c-b625-bcaf12b17ade"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.311055 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-config-data" (OuterVolumeSpecName: "config-data") pod "967b449a-1414-4a5c-b625-bcaf12b17ade" (UID: "967b449a-1414-4a5c-b625-bcaf12b17ade"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.350406 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.350447 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/967b449a-1414-4a5c-b625-bcaf12b17ade-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.350460 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n52gp\" (UniqueName: \"kubernetes.io/projected/967b449a-1414-4a5c-b625-bcaf12b17ade-kube-api-access-n52gp\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.654138 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jbstq" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.654127 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jbstq" event={"ID":"967b449a-1414-4a5c-b625-bcaf12b17ade","Type":"ContainerDied","Data":"56c91e8dc88a7d8da4509ded8923f1a4a18a719a8066a712556a9928447a6799"} Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.654295 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c91e8dc88a7d8da4509ded8923f1a4a18a719a8066a712556a9928447a6799" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.911615 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g5r95"] Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.950701 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-v7wns"] Feb 02 10:52:24 crc kubenswrapper[4845]: E0202 10:52:24.951287 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="967b449a-1414-4a5c-b625-bcaf12b17ade" containerName="keystone-db-sync" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.951309 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="967b449a-1414-4a5c-b625-bcaf12b17ade" containerName="keystone-db-sync" Feb 02 10:52:24 crc kubenswrapper[4845]: E0202 10:52:24.951325 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a38136-159d-482a-988e-07f3b77fdbb4" containerName="init" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.951334 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a38136-159d-482a-988e-07f3b77fdbb4" containerName="init" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.951607 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a38136-159d-482a-988e-07f3b77fdbb4" containerName="init" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.951630 4845 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="967b449a-1414-4a5c-b625-bcaf12b17ade" containerName="keystone-db-sync" Feb 02 10:52:24 crc kubenswrapper[4845]: I0202 10:52:24.962381 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.017855 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-glpvf"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.025589 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.049759 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.050081 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.050275 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4r6h5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.050498 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.071135 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.134873 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-v7wns"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225577 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-config-data\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc 
kubenswrapper[4845]: I0202 10:52:25.225662 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgg4z\" (UniqueName: \"kubernetes.io/projected/4dc8c937-0b9b-461a-be1b-02bdb587b70e-kube-api-access-wgg4z\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225694 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-config\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225709 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225733 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225750 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-combined-ca-bundle\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " 
pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225767 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-credential-keys\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225802 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225833 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225898 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-fernet-keys\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225915 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-scripts\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " 
pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.225932 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sgx4\" (UniqueName: \"kubernetes.io/projected/250ffbd9-33d6-4a0d-b812-1d092341d4f9-kube-api-access-9sgx4\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.232111 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-glpvf"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.283586 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-kxrm5"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.284989 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.317374 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-kxrm5"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.321310 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.321533 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2czql" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327609 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327663 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327722 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-fernet-keys\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327744 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-scripts\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327761 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sgx4\" (UniqueName: \"kubernetes.io/projected/250ffbd9-33d6-4a0d-b812-1d092341d4f9-kube-api-access-9sgx4\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327798 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-config-data\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327817 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjs7z\" (UniqueName: 
\"kubernetes.io/projected/250e18d9-cb14-4309-8d0c-fb341511dba6-kube-api-access-qjs7z\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327850 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-combined-ca-bundle\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.327993 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-config-data\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.328022 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgg4z\" (UniqueName: \"kubernetes.io/projected/4dc8c937-0b9b-461a-be1b-02bdb587b70e-kube-api-access-wgg4z\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.328054 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-config\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.328073 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.328098 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.328120 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-combined-ca-bundle\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.328137 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-credential-keys\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.335688 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-cpdt4"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.338231 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.341194 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.341791 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.344969 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-config\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.345428 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.346040 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 
10:52:25.353278 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cpdt4"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.360847 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tjnkn" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.361068 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.361199 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.372347 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-credential-keys\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.372436 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-config-data\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.375758 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-combined-ca-bundle\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.377541 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-fernet-keys\") pod \"keystone-bootstrap-glpvf\" 
(UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.378961 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-scripts\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.432933 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sgx4\" (UniqueName: \"kubernetes.io/projected/250ffbd9-33d6-4a0d-b812-1d092341d4f9-kube-api-access-9sgx4\") pod \"keystone-bootstrap-glpvf\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.435013 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-combined-ca-bundle\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.435087 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-config-data\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.435330 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjs7z\" (UniqueName: \"kubernetes.io/projected/250e18d9-cb14-4309-8d0c-fb341511dba6-kube-api-access-qjs7z\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 
10:52:25.436521 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgg4z\" (UniqueName: \"kubernetes.io/projected/4dc8c937-0b9b-461a-be1b-02bdb587b70e-kube-api-access-wgg4z\") pod \"dnsmasq-dns-847c4cc679-v7wns\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.445631 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-combined-ca-bundle\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.454010 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-config-data\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.525809 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjs7z\" (UniqueName: \"kubernetes.io/projected/250e18d9-cb14-4309-8d0c-fb341511dba6-kube-api-access-qjs7z\") pod \"heat-db-sync-kxrm5\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.547535 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-config\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.547602 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9mpk6\" (UniqueName: \"kubernetes.io/projected/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-kube-api-access-9mpk6\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.547672 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-combined-ca-bundle\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.612510 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-g8b4r"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.615555 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.623960 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kxrm5" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.642043 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-v7wns"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.643083 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.651590 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rflw4\" (UniqueName: \"kubernetes.io/projected/183b0ef9-490f-43a1-a464-2bd64a820ebd-kube-api-access-rflw4\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.651672 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-combined-ca-bundle\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.651706 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-config-data\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.651751 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-config\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.651775 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-scripts\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 
10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.651812 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mpk6\" (UniqueName: \"kubernetes.io/projected/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-kube-api-access-9mpk6\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.651866 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-db-sync-config-data\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.651973 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-combined-ca-bundle\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.652079 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183b0ef9-490f-43a1-a464-2bd64a820ebd-etc-machine-id\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.658740 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2glkz" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.658999 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.659171 4845 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.670009 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-config\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.678461 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-combined-ca-bundle\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.690071 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g8b4r"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.704252 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" podUID="cc6155c0-72ff-4f97-9748-716e3df8ad88" containerName="dnsmasq-dns" containerID="cri-o://2d2c3127f10566f1a93e004f0e822b849cbd5d19678d12b2cc9041ee23459162" gracePeriod=10 Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.709114 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mpk6\" (UniqueName: \"kubernetes.io/projected/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-kube-api-access-9mpk6\") pod \"neutron-db-sync-cpdt4\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.712587 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.754634 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-db-sync-config-data\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.756656 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183b0ef9-490f-43a1-a464-2bd64a820ebd-etc-machine-id\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.756785 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rflw4\" (UniqueName: \"kubernetes.io/projected/183b0ef9-490f-43a1-a464-2bd64a820ebd-kube-api-access-rflw4\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.756852 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-combined-ca-bundle\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.756894 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-config-data\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 
10:52:25.756942 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-scripts\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.760254 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183b0ef9-490f-43a1-a464-2bd64a820ebd-etc-machine-id\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.785429 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-scripts\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.786539 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-combined-ca-bundle\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.802331 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-config-data\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.802456 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-gxjbc"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.804059 4845 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.805432 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rflw4\" (UniqueName: \"kubernetes.io/projected/183b0ef9-490f-43a1-a464-2bd64a820ebd-kube-api-access-rflw4\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.818418 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-db-sync-config-data\") pod \"cinder-db-sync-g8b4r\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.843714 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kzmx2" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.843946 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.866383 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gxjbc"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.870821 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-combined-ca-bundle\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.870872 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-config-data\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.870946 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-scripts\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.871002 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn4fj\" (UniqueName: \"kubernetes.io/projected/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-kube-api-access-vn4fj\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.871032 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-logs\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.872210 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.900521 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-sbm2k"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.902851 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.919938 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.920748 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-hft5g"] Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.925798 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.936997 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.937228 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rddwg" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.947954 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.976724 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjfj8\" (UniqueName: \"kubernetes.io/projected/9868fb5b-b18e-42b0-8532-6e6a55da71d2-kube-api-access-cjfj8\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.976784 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.976809 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-combined-ca-bundle\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.976839 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn4fj\" (UniqueName: \"kubernetes.io/projected/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-kube-api-access-vn4fj\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.976872 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-logs\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.976984 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977052 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-combined-ca-bundle\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977095 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-config-data\") pod 
\"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977120 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977142 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-db-sync-config-data\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977168 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcx2l\" (UniqueName: \"kubernetes.io/projected/08802fb3-9897-4819-a38b-fe13e8892b47-kube-api-access-gcx2l\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977195 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977229 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-scripts\") pod 
\"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977243 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-config\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.977504 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-logs\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.983836 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-combined-ca-bundle\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:25 crc kubenswrapper[4845]: I0202 10:52:25.991799 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-config-data\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:25.999579 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hft5g"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.012464 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-scripts\") pod 
\"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.028407 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn4fj\" (UniqueName: \"kubernetes.io/projected/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-kube-api-access-vn4fj\") pod \"placement-db-sync-gxjbc\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.077526 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-sbm2k"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079096 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-config\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079173 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjfj8\" (UniqueName: \"kubernetes.io/projected/9868fb5b-b18e-42b0-8532-6e6a55da71d2-kube-api-access-cjfj8\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079222 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079257 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-combined-ca-bundle\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079326 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079426 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079455 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-db-sync-config-data\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079488 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcx2l\" (UniqueName: \"kubernetes.io/projected/08802fb3-9897-4819-a38b-fe13e8892b47-kube-api-access-gcx2l\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.079515 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.081519 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-config\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.085237 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.086007 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.086751 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.087801 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" 
(UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.099558 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-combined-ca-bundle\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.104831 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-db-sync-config-data\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.135961 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.138240 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcx2l\" (UniqueName: \"kubernetes.io/projected/08802fb3-9897-4819-a38b-fe13e8892b47-kube-api-access-gcx2l\") pod \"dnsmasq-dns-785d8bcb8c-sbm2k\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.138975 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.143959 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.144391 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.157825 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.162586 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjfj8\" (UniqueName: \"kubernetes.io/projected/9868fb5b-b18e-42b0-8532-6e6a55da71d2-kube-api-access-cjfj8\") pod \"barbican-db-sync-hft5g\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.201045 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gxjbc" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.246435 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.282160 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hft5g" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.299519 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-log-httpd\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.299711 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-scripts\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.299752 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-config-data\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.299791 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7v5\" (UniqueName: \"kubernetes.io/projected/ba91fb37-4550-4684-99bb-45dba169a879-kube-api-access-4g7v5\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.299969 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.302692 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-run-httpd\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.302758 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.351874 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.354954 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.361521 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.364482 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.364874 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.365176 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-snsd2" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.365640 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.404479 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-run-httpd\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.404525 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.404573 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-log-httpd\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.404639 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-scripts\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.404658 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-config-data\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.404677 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7v5\" (UniqueName: \"kubernetes.io/projected/ba91fb37-4550-4684-99bb-45dba169a879-kube-api-access-4g7v5\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " 
pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.404744 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.412530 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-log-httpd\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.412863 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-run-httpd\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.417570 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-config-data\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.419362 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.433360 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-scripts\") pod 
\"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.444063 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.447004 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7v5\" (UniqueName: \"kubernetes.io/projected/ba91fb37-4550-4684-99bb-45dba169a879-kube-api-access-4g7v5\") pod \"ceilometer-0\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") " pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.503428 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.508376 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-config-data\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.508429 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.508506 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.508539 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-scripts\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.508571 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.508645 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4hxp\" (UniqueName: \"kubernetes.io/projected/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-kube-api-access-d4hxp\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.508678 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-logs\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.508741 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.515900 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.527851 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.538645 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.541079 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.559186 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623077 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623152 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc 
kubenswrapper[4845]: I0202 10:52:26.623217 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623288 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-config-data\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623322 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623382 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623415 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 
10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623436 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623469 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-scripts\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623501 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623554 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxcf6\" (UniqueName: \"kubernetes.io/projected/f7543ae2-53fd-42d7-971f-a09923f10187-kube-api-access-xxcf6\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623577 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " 
pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623595 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623646 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4hxp\" (UniqueName: \"kubernetes.io/projected/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-kube-api-access-d4hxp\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623678 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-logs\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.623706 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.628282 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-logs\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " 
pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.629021 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.632206 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.632537 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-config-data\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.641834 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-scripts\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.656332 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.656383 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ed60d0e4f5ad2fb51e67cadb4519054184ad51c31b402d173121c9411d32387/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.657354 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.658592 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4hxp\" (UniqueName: \"kubernetes.io/projected/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-kube-api-access-d4hxp\") pod \"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.727813 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.727876 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.727978 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxcf6\" (UniqueName: \"kubernetes.io/projected/f7543ae2-53fd-42d7-971f-a09923f10187-kube-api-access-xxcf6\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.728010 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.728033 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.728095 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.728171 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.728217 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.735103 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.735360 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-logs\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.739718 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.739757 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2a65ad887e305c92ac1c8235cc9c5fc327f1ea7ce91b9974356e11ee00bc2f81/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.740098 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.744008 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.747220 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.753853 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.779874 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxcf6\" (UniqueName: \"kubernetes.io/projected/f7543ae2-53fd-42d7-971f-a09923f10187-kube-api-access-xxcf6\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.797267 4845 generic.go:334] "Generic (PLEG): container finished" podID="cc6155c0-72ff-4f97-9748-716e3df8ad88" containerID="2d2c3127f10566f1a93e004f0e822b849cbd5d19678d12b2cc9041ee23459162" exitCode=0 Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.797342 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" event={"ID":"cc6155c0-72ff-4f97-9748-716e3df8ad88","Type":"ContainerDied","Data":"2d2c3127f10566f1a93e004f0e822b849cbd5d19678d12b2cc9041ee23459162"} Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.840146 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-v7wns"] Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.889231 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.896059 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod 
\"glance-default-external-api-0\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:26 crc kubenswrapper[4845]: I0202 10:52:26.923428 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.050626 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-config\") pod \"cc6155c0-72ff-4f97-9748-716e3df8ad88\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.051000 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-nb\") pod \"cc6155c0-72ff-4f97-9748-716e3df8ad88\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.051039 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-swift-storage-0\") pod \"cc6155c0-72ff-4f97-9748-716e3df8ad88\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.051532 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-sb\") pod \"cc6155c0-72ff-4f97-9748-716e3df8ad88\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.051786 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdm58\" (UniqueName: 
\"kubernetes.io/projected/cc6155c0-72ff-4f97-9748-716e3df8ad88-kube-api-access-hdm58\") pod \"cc6155c0-72ff-4f97-9748-716e3df8ad88\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.051850 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-svc\") pod \"cc6155c0-72ff-4f97-9748-716e3df8ad88\" (UID: \"cc6155c0-72ff-4f97-9748-716e3df8ad88\") " Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.064989 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6155c0-72ff-4f97-9748-716e3df8ad88-kube-api-access-hdm58" (OuterVolumeSpecName: "kube-api-access-hdm58") pod "cc6155c0-72ff-4f97-9748-716e3df8ad88" (UID: "cc6155c0-72ff-4f97-9748-716e3df8ad88"). InnerVolumeSpecName "kube-api-access-hdm58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.154003 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.160328 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdm58\" (UniqueName: \"kubernetes.io/projected/cc6155c0-72ff-4f97-9748-716e3df8ad88-kube-api-access-hdm58\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.177777 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.200550 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cc6155c0-72ff-4f97-9748-716e3df8ad88" (UID: "cc6155c0-72ff-4f97-9748-716e3df8ad88"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.206303 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc6155c0-72ff-4f97-9748-716e3df8ad88" (UID: "cc6155c0-72ff-4f97-9748-716e3df8ad88"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.218850 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc6155c0-72ff-4f97-9748-716e3df8ad88" (UID: "cc6155c0-72ff-4f97-9748-716e3df8ad88"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.219537 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc6155c0-72ff-4f97-9748-716e3df8ad88" (UID: "cc6155c0-72ff-4f97-9748-716e3df8ad88"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.241085 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-kxrm5"] Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.246001 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-config" (OuterVolumeSpecName: "config") pod "cc6155c0-72ff-4f97-9748-716e3df8ad88" (UID: "cc6155c0-72ff-4f97-9748-716e3df8ad88"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.263716 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-glpvf"] Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.264131 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.264156 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.264192 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.264202 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.264216 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc6155c0-72ff-4f97-9748-716e3df8ad88-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:27 crc kubenswrapper[4845]: W0202 10:52:27.287029 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod250e18d9_cb14_4309_8d0c_fb341511dba6.slice/crio-17906d788647666b2cd6e069e9338a17bf04443c927ea4ca63e22baf04dbc8dc WatchSource:0}: Error finding container 17906d788647666b2cd6e069e9338a17bf04443c927ea4ca63e22baf04dbc8dc: Status 404 returned error can't find the 
container with id 17906d788647666b2cd6e069e9338a17bf04443c927ea4ca63e22baf04dbc8dc Feb 02 10:52:27 crc kubenswrapper[4845]: W0202 10:52:27.290535 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod250ffbd9_33d6_4a0d_b812_1d092341d4f9.slice/crio-de3e03d863b0420869eef2e3a7c17fb4d5f1b982bfb909ef8098e97f81653921 WatchSource:0}: Error finding container de3e03d863b0420869eef2e3a7c17fb4d5f1b982bfb909ef8098e97f81653921: Status 404 returned error can't find the container with id de3e03d863b0420869eef2e3a7c17fb4d5f1b982bfb909ef8098e97f81653921 Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.887732 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-sbm2k"] Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.925574 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-glpvf" event={"ID":"250ffbd9-33d6-4a0d-b812-1d092341d4f9","Type":"ContainerStarted","Data":"cc9a388001d07f3511088bc5b867a073370c23dfc566a76ed6b25957f4cd9611"} Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.925620 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-glpvf" event={"ID":"250ffbd9-33d6-4a0d-b812-1d092341d4f9","Type":"ContainerStarted","Data":"de3e03d863b0420869eef2e3a7c17fb4d5f1b982bfb909ef8098e97f81653921"} Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.936261 4845 generic.go:334] "Generic (PLEG): container finished" podID="4dc8c937-0b9b-461a-be1b-02bdb587b70e" containerID="237eb45884084c45016b5f46a907fcfa367aa8dbef9b4eddbe88c4430a23713e" exitCode=0 Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.936318 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-v7wns" event={"ID":"4dc8c937-0b9b-461a-be1b-02bdb587b70e","Type":"ContainerDied","Data":"237eb45884084c45016b5f46a907fcfa367aa8dbef9b4eddbe88c4430a23713e"} Feb 02 10:52:27 crc 
kubenswrapper[4845]: I0202 10:52:27.936343 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-v7wns" event={"ID":"4dc8c937-0b9b-461a-be1b-02bdb587b70e","Type":"ContainerStarted","Data":"5155295f42a65af746e2fad57322cd5d5a1ab40f98230c024d2700e401492665"} Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.952119 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g8b4r"] Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.972392 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxrm5" event={"ID":"250e18d9-cb14-4309-8d0c-fb341511dba6","Type":"ContainerStarted","Data":"17906d788647666b2cd6e069e9338a17bf04443c927ea4ca63e22baf04dbc8dc"} Feb 02 10:52:27 crc kubenswrapper[4845]: I0202 10:52:27.979701 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-cpdt4"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.078384 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" event={"ID":"cc6155c0-72ff-4f97-9748-716e3df8ad88","Type":"ContainerDied","Data":"4cc1cd6ad5cc574e57eca193db9f213196ac2057c994f0da0d19824fee8e97c1"} Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.078445 4845 scope.go:117] "RemoveContainer" containerID="2d2c3127f10566f1a93e004f0e822b849cbd5d19678d12b2cc9041ee23459162" Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.079740 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-g5r95" Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.138613 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gxjbc"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.165543 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-glpvf" podStartSLOduration=4.165517253 podStartE2EDuration="4.165517253s" podCreationTimestamp="2026-02-02 10:52:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:28.028632035 +0000 UTC m=+1229.120033485" watchObservedRunningTime="2026-02-02 10:52:28.165517253 +0000 UTC m=+1229.256918703" Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.174417 4845 scope.go:117] "RemoveContainer" containerID="3f008994368901472bc0803e0e3411d898bba47e6f98a3c829832d620028583b" Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.308740 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.382013 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g5r95"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.407572 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.462258 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-g5r95"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.577942 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-hft5g"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.621912 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 
10:52:28.686818 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:28 crc kubenswrapper[4845]: I0202 10:52:28.964112 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.116951 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cpdt4" event={"ID":"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2","Type":"ContainerStarted","Data":"eaea8c79134a763eb67941fd983f9ce0e269d99bbc22b44421da606e30805a94"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.116999 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cpdt4" event={"ID":"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2","Type":"ContainerStarted","Data":"910eea5be92e0d78d64ac4574401cd26e5b94f8c77cd06cf2a9e9a5f781e5430"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.123990 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba91fb37-4550-4684-99bb-45dba169a879","Type":"ContainerStarted","Data":"320ae8c7183d27e383573443ad820bda2273505766b721abc86327e40604e0ca"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.124974 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.127761 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a","Type":"ContainerStarted","Data":"3a2962ac3acf70d6e69f1fadfc2545a5d2cf6a481bf3dd186493c288796a95b6"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.142587 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g8b4r" event={"ID":"183b0ef9-490f-43a1-a464-2bd64a820ebd","Type":"ContainerStarted","Data":"8487cd80461a5551ea17adb0f75cb6e4ce51ee5bd0eda70e468bd0162117e3f9"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.151119 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gxjbc" event={"ID":"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b","Type":"ContainerStarted","Data":"f5152a46482c33a388878313470bd17796ec05812413482c82f7e14ccb92881e"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.155678 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-cpdt4" podStartSLOduration=4.155651884 podStartE2EDuration="4.155651884s" podCreationTimestamp="2026-02-02 10:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:29.133876472 +0000 UTC m=+1230.225277932" watchObservedRunningTime="2026-02-02 10:52:29.155651884 +0000 UTC m=+1230.247053334" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.159018 4845 generic.go:334] "Generic (PLEG): container finished" podID="08802fb3-9897-4819-a38b-fe13e8892b47" containerID="ad0d9ec1ecba0e033c137223df62f5b330d1ec35e21a3f0e070e5061487e39e2" exitCode=0 Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.159128 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" 
event={"ID":"08802fb3-9897-4819-a38b-fe13e8892b47","Type":"ContainerDied","Data":"ad0d9ec1ecba0e033c137223df62f5b330d1ec35e21a3f0e070e5061487e39e2"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.159176 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" event={"ID":"08802fb3-9897-4819-a38b-fe13e8892b47","Type":"ContainerStarted","Data":"990f2a82fd916ae58e607d4d041c7e0245ec85e42bc4953b989f05218c6f9e19"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.187845 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hft5g" event={"ID":"9868fb5b-b18e-42b0-8532-6e6a55da71d2","Type":"ContainerStarted","Data":"303b261fa422f333b504b71646d5d15f21982e8b9a4a8cba35c7d99137363acc"} Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.258410 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-swift-storage-0\") pod \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.258873 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgg4z\" (UniqueName: \"kubernetes.io/projected/4dc8c937-0b9b-461a-be1b-02bdb587b70e-kube-api-access-wgg4z\") pod \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.258937 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-config\") pod \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.258986 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-sb\") pod \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.259056 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-svc\") pod \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.259144 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-nb\") pod \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.266608 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dc8c937-0b9b-461a-be1b-02bdb587b70e-kube-api-access-wgg4z" (OuterVolumeSpecName: "kube-api-access-wgg4z") pod "4dc8c937-0b9b-461a-be1b-02bdb587b70e" (UID: "4dc8c937-0b9b-461a-be1b-02bdb587b70e"). InnerVolumeSpecName "kube-api-access-wgg4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.273650 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.304396 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4dc8c937-0b9b-461a-be1b-02bdb587b70e" (UID: "4dc8c937-0b9b-461a-be1b-02bdb587b70e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.312514 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-config" (OuterVolumeSpecName: "config") pod "4dc8c937-0b9b-461a-be1b-02bdb587b70e" (UID: "4dc8c937-0b9b-461a-be1b-02bdb587b70e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.321067 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4dc8c937-0b9b-461a-be1b-02bdb587b70e" (UID: "4dc8c937-0b9b-461a-be1b-02bdb587b70e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.325444 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4dc8c937-0b9b-461a-be1b-02bdb587b70e" (UID: "4dc8c937-0b9b-461a-be1b-02bdb587b70e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.365159 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4dc8c937-0b9b-461a-be1b-02bdb587b70e" (UID: "4dc8c937-0b9b-461a-be1b-02bdb587b70e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.366452 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-nb\") pod \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\" (UID: \"4dc8c937-0b9b-461a-be1b-02bdb587b70e\") " Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.367334 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgg4z\" (UniqueName: \"kubernetes.io/projected/4dc8c937-0b9b-461a-be1b-02bdb587b70e-kube-api-access-wgg4z\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.367818 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.367913 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.367994 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.368054 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:29 crc kubenswrapper[4845]: W0202 10:52:29.368390 4845 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4dc8c937-0b9b-461a-be1b-02bdb587b70e/volumes/kubernetes.io~configmap/ovsdbserver-nb Feb 02 
10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.369052 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4dc8c937-0b9b-461a-be1b-02bdb587b70e" (UID: "4dc8c937-0b9b-461a-be1b-02bdb587b70e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.471226 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4dc8c937-0b9b-461a-be1b-02bdb587b70e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:29 crc kubenswrapper[4845]: I0202 10:52:29.737776 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6155c0-72ff-4f97-9748-716e3df8ad88" path="/var/lib/kubelet/pods/cc6155c0-72ff-4f97-9748-716e3df8ad88/volumes" Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.209714 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a","Type":"ContainerStarted","Data":"754cb2f68170ee366688fac6af09583225928231cbd2adb4266116caffa237e6"} Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.212061 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-v7wns" event={"ID":"4dc8c937-0b9b-461a-be1b-02bdb587b70e","Type":"ContainerDied","Data":"5155295f42a65af746e2fad57322cd5d5a1ab40f98230c024d2700e401492665"} Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.212098 4845 scope.go:117] "RemoveContainer" containerID="237eb45884084c45016b5f46a907fcfa367aa8dbef9b4eddbe88c4430a23713e" Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.212195 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-v7wns" Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.215074 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7543ae2-53fd-42d7-971f-a09923f10187","Type":"ContainerStarted","Data":"9a806cef3c6476b0a1f1311cf8436474892e88fa49f5a547d7d4cfbbecc99d66"} Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.259142 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" event={"ID":"08802fb3-9897-4819-a38b-fe13e8892b47","Type":"ContainerStarted","Data":"3ed1c7658c68f0c8c4a3c24a82346d038eae78839b07b31f8eefe48619b71f5d"} Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.259244 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.323931 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" podStartSLOduration=5.323864119 podStartE2EDuration="5.323864119s" podCreationTimestamp="2026-02-02 10:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:30.286964426 +0000 UTC m=+1231.378365886" watchObservedRunningTime="2026-02-02 10:52:30.323864119 +0000 UTC m=+1231.415265569" Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.466591 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-v7wns"] Feb 02 10:52:30 crc kubenswrapper[4845]: I0202 10:52:30.473133 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-v7wns"] Feb 02 10:52:31 crc kubenswrapper[4845]: I0202 10:52:31.330873 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"f7543ae2-53fd-42d7-971f-a09923f10187","Type":"ContainerStarted","Data":"5afe1239bc585e0bef47e3d113767ef48eca1fb7a34ae2876787a3e542d61760"} Feb 02 10:52:31 crc kubenswrapper[4845]: I0202 10:52:31.726557 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dc8c937-0b9b-461a-be1b-02bdb587b70e" path="/var/lib/kubelet/pods/4dc8c937-0b9b-461a-be1b-02bdb587b70e/volumes" Feb 02 10:52:32 crc kubenswrapper[4845]: I0202 10:52:32.348926 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7543ae2-53fd-42d7-971f-a09923f10187","Type":"ContainerStarted","Data":"bd78b5513d3cbaeab95964442b610e53834c55a502d340e969dbf735d6641a12"} Feb 02 10:52:32 crc kubenswrapper[4845]: I0202 10:52:32.348999 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" containerName="glance-log" containerID="cri-o://5afe1239bc585e0bef47e3d113767ef48eca1fb7a34ae2876787a3e542d61760" gracePeriod=30 Feb 02 10:52:32 crc kubenswrapper[4845]: I0202 10:52:32.349240 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" containerName="glance-httpd" containerID="cri-o://bd78b5513d3cbaeab95964442b610e53834c55a502d340e969dbf735d6641a12" gracePeriod=30 Feb 02 10:52:32 crc kubenswrapper[4845]: I0202 10:52:32.363176 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a","Type":"ContainerStarted","Data":"27b4758631a6299ed33bc9ad77808ae89e6b4adea1f9c02dc02478e0b78913fa"} Feb 02 10:52:32 crc kubenswrapper[4845]: I0202 10:52:32.363192 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" 
containerName="glance-log" containerID="cri-o://754cb2f68170ee366688fac6af09583225928231cbd2adb4266116caffa237e6" gracePeriod=30 Feb 02 10:52:32 crc kubenswrapper[4845]: I0202 10:52:32.363252 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerName="glance-httpd" containerID="cri-o://27b4758631a6299ed33bc9ad77808ae89e6b4adea1f9c02dc02478e0b78913fa" gracePeriod=30 Feb 02 10:52:32 crc kubenswrapper[4845]: I0202 10:52:32.402836 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.402807718 podStartE2EDuration="7.402807718s" podCreationTimestamp="2026-02-02 10:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:32.383470555 +0000 UTC m=+1233.474872005" watchObservedRunningTime="2026-02-02 10:52:32.402807718 +0000 UTC m=+1233.494209168" Feb 02 10:52:32 crc kubenswrapper[4845]: I0202 10:52:32.428547 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.428521372 podStartE2EDuration="7.428521372s" podCreationTimestamp="2026-02-02 10:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:32.417926689 +0000 UTC m=+1233.509328149" watchObservedRunningTime="2026-02-02 10:52:32.428521372 +0000 UTC m=+1233.519922822" Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.057097 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.062266 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 
02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.395121 4845 generic.go:334] "Generic (PLEG): container finished" podID="f7543ae2-53fd-42d7-971f-a09923f10187" containerID="bd78b5513d3cbaeab95964442b610e53834c55a502d340e969dbf735d6641a12" exitCode=0 Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.395160 4845 generic.go:334] "Generic (PLEG): container finished" podID="f7543ae2-53fd-42d7-971f-a09923f10187" containerID="5afe1239bc585e0bef47e3d113767ef48eca1fb7a34ae2876787a3e542d61760" exitCode=143 Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.395208 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7543ae2-53fd-42d7-971f-a09923f10187","Type":"ContainerDied","Data":"bd78b5513d3cbaeab95964442b610e53834c55a502d340e969dbf735d6641a12"} Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.395280 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7543ae2-53fd-42d7-971f-a09923f10187","Type":"ContainerDied","Data":"5afe1239bc585e0bef47e3d113767ef48eca1fb7a34ae2876787a3e542d61760"} Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.401378 4845 generic.go:334] "Generic (PLEG): container finished" podID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerID="27b4758631a6299ed33bc9ad77808ae89e6b4adea1f9c02dc02478e0b78913fa" exitCode=0 Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.401440 4845 generic.go:334] "Generic (PLEG): container finished" podID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerID="754cb2f68170ee366688fac6af09583225928231cbd2adb4266116caffa237e6" exitCode=143 Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.401468 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a","Type":"ContainerDied","Data":"27b4758631a6299ed33bc9ad77808ae89e6b4adea1f9c02dc02478e0b78913fa"} Feb 02 10:52:33 crc 
kubenswrapper[4845]: I0202 10:52:33.401526 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a","Type":"ContainerDied","Data":"754cb2f68170ee366688fac6af09583225928231cbd2adb4266116caffa237e6"} Feb 02 10:52:33 crc kubenswrapper[4845]: I0202 10:52:33.407998 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 02 10:52:34 crc kubenswrapper[4845]: I0202 10:52:34.416281 4845 generic.go:334] "Generic (PLEG): container finished" podID="250ffbd9-33d6-4a0d-b812-1d092341d4f9" containerID="cc9a388001d07f3511088bc5b867a073370c23dfc566a76ed6b25957f4cd9611" exitCode=0 Feb 02 10:52:34 crc kubenswrapper[4845]: I0202 10:52:34.416355 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-glpvf" event={"ID":"250ffbd9-33d6-4a0d-b812-1d092341d4f9","Type":"ContainerDied","Data":"cc9a388001d07f3511088bc5b867a073370c23dfc566a76ed6b25957f4cd9611"} Feb 02 10:52:36 crc kubenswrapper[4845]: I0202 10:52:36.251732 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:52:36 crc kubenswrapper[4845]: I0202 10:52:36.317243 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7lh98"] Feb 02 10:52:36 crc kubenswrapper[4845]: I0202 10:52:36.317489 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-7lh98" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="dnsmasq-dns" containerID="cri-o://7b934d9dc539ae78cc75a18c8a04ead9cef0ee49e2411933655a99884f524af5" gracePeriod=10 Feb 02 10:52:36 crc kubenswrapper[4845]: I0202 10:52:36.460730 4845 generic.go:334] "Generic (PLEG): container finished" podID="125bfda8-e971-4249-8b07-0bbff61e4725" containerID="7b934d9dc539ae78cc75a18c8a04ead9cef0ee49e2411933655a99884f524af5" exitCode=0 
Feb 02 10:52:36 crc kubenswrapper[4845]: I0202 10:52:36.460804 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7lh98" event={"ID":"125bfda8-e971-4249-8b07-0bbff61e4725","Type":"ContainerDied","Data":"7b934d9dc539ae78cc75a18c8a04ead9cef0ee49e2411933655a99884f524af5"} Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.288390 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.303780 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-glpvf" Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.466769 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sgx4\" (UniqueName: \"kubernetes.io/projected/250ffbd9-33d6-4a0d-b812-1d092341d4f9-kube-api-access-9sgx4\") pod \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.466839 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-httpd-run\") pod \"f7543ae2-53fd-42d7-971f-a09923f10187\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.466909 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-combined-ca-bundle\") pod \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.466938 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-config-data\") pod \"f7543ae2-53fd-42d7-971f-a09923f10187\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467017 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-scripts\") pod \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467080 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-internal-tls-certs\") pod \"f7543ae2-53fd-42d7-971f-a09923f10187\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467267 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"f7543ae2-53fd-42d7-971f-a09923f10187\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467305 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-logs\") pod \"f7543ae2-53fd-42d7-971f-a09923f10187\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467328 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxcf6\" (UniqueName: \"kubernetes.io/projected/f7543ae2-53fd-42d7-971f-a09923f10187-kube-api-access-xxcf6\") pod \"f7543ae2-53fd-42d7-971f-a09923f10187\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467569 4845 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f7543ae2-53fd-42d7-971f-a09923f10187" (UID: "f7543ae2-53fd-42d7-971f-a09923f10187"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467681 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-config-data\") pod \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467815 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-credential-keys\") pod \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467846 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-fernet-keys\") pod \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\" (UID: \"250ffbd9-33d6-4a0d-b812-1d092341d4f9\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.467998 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-combined-ca-bundle\") pod \"f7543ae2-53fd-42d7-971f-a09923f10187\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") " Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.468054 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-scripts\") pod \"f7543ae2-53fd-42d7-971f-a09923f10187\" (UID: \"f7543ae2-53fd-42d7-971f-a09923f10187\") "
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.468290 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-logs" (OuterVolumeSpecName: "logs") pod "f7543ae2-53fd-42d7-971f-a09923f10187" (UID: "f7543ae2-53fd-42d7-971f-a09923f10187"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.469006 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.469026 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f7543ae2-53fd-42d7-971f-a09923f10187-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.490002 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-scripts" (OuterVolumeSpecName: "scripts") pod "f7543ae2-53fd-42d7-971f-a09923f10187" (UID: "f7543ae2-53fd-42d7-971f-a09923f10187"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.490139 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "250ffbd9-33d6-4a0d-b812-1d092341d4f9" (UID: "250ffbd9-33d6-4a0d-b812-1d092341d4f9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.491930 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-scripts" (OuterVolumeSpecName: "scripts") pod "250ffbd9-33d6-4a0d-b812-1d092341d4f9" (UID: "250ffbd9-33d6-4a0d-b812-1d092341d4f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.494171 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7543ae2-53fd-42d7-971f-a09923f10187-kube-api-access-xxcf6" (OuterVolumeSpecName: "kube-api-access-xxcf6") pod "f7543ae2-53fd-42d7-971f-a09923f10187" (UID: "f7543ae2-53fd-42d7-971f-a09923f10187"). InnerVolumeSpecName "kube-api-access-xxcf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.494660 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250ffbd9-33d6-4a0d-b812-1d092341d4f9-kube-api-access-9sgx4" (OuterVolumeSpecName: "kube-api-access-9sgx4") pod "250ffbd9-33d6-4a0d-b812-1d092341d4f9" (UID: "250ffbd9-33d6-4a0d-b812-1d092341d4f9"). InnerVolumeSpecName "kube-api-access-9sgx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.496232 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995" (OuterVolumeSpecName: "glance") pod "f7543ae2-53fd-42d7-971f-a09923f10187" (UID: "f7543ae2-53fd-42d7-971f-a09923f10187"). InnerVolumeSpecName "pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.506762 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "250ffbd9-33d6-4a0d-b812-1d092341d4f9" (UID: "250ffbd9-33d6-4a0d-b812-1d092341d4f9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.534219 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7543ae2-53fd-42d7-971f-a09923f10187" (UID: "f7543ae2-53fd-42d7-971f-a09923f10187"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.539493 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-config-data" (OuterVolumeSpecName: "config-data") pod "250ffbd9-33d6-4a0d-b812-1d092341d4f9" (UID: "250ffbd9-33d6-4a0d-b812-1d092341d4f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.547159 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.548075 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f7543ae2-53fd-42d7-971f-a09923f10187","Type":"ContainerDied","Data":"9a806cef3c6476b0a1f1311cf8436474892e88fa49f5a547d7d4cfbbecc99d66"}
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.548139 4845 scope.go:117] "RemoveContainer" containerID="bd78b5513d3cbaeab95964442b610e53834c55a502d340e969dbf735d6641a12"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.562153 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-glpvf" event={"ID":"250ffbd9-33d6-4a0d-b812-1d092341d4f9","Type":"ContainerDied","Data":"de3e03d863b0420869eef2e3a7c17fb4d5f1b982bfb909ef8098e97f81653921"}
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.562191 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-glpvf"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.562221 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de3e03d863b0420869eef2e3a7c17fb4d5f1b982bfb909ef8098e97f81653921"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.575034 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.585258 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.585358 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sgx4\" (UniqueName: \"kubernetes.io/projected/250ffbd9-33d6-4a0d-b812-1d092341d4f9-kube-api-access-9sgx4\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.585437 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.585542 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") on node \"crc\" "
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.585630 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxcf6\" (UniqueName: \"kubernetes.io/projected/f7543ae2-53fd-42d7-971f-a09923f10187-kube-api-access-xxcf6\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.585727 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.585810 4845 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.585924 4845 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.604828 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "250ffbd9-33d6-4a0d-b812-1d092341d4f9" (UID: "250ffbd9-33d6-4a0d-b812-1d092341d4f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.607703 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-config-data" (OuterVolumeSpecName: "config-data") pod "f7543ae2-53fd-42d7-971f-a09923f10187" (UID: "f7543ae2-53fd-42d7-971f-a09923f10187"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.623691 4845 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.623807 4845 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995") on node "crc"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.624722 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f7543ae2-53fd-42d7-971f-a09923f10187" (UID: "f7543ae2-53fd-42d7-971f-a09923f10187"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.688598 4845 reconciler_common.go:293] "Volume detached for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.688645 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250ffbd9-33d6-4a0d-b812-1d092341d4f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.688659 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.688671 4845 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7543ae2-53fd-42d7-971f-a09923f10187-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.763221 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7lh98" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.874172 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.889932 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.925512 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:52:39 crc kubenswrapper[4845]: E0202 10:52:39.926426 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" containerName="glance-httpd"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.926443 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" containerName="glance-httpd"
Feb 02 10:52:39 crc kubenswrapper[4845]: E0202 10:52:39.926482 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6155c0-72ff-4f97-9748-716e3df8ad88" containerName="dnsmasq-dns"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.926490 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6155c0-72ff-4f97-9748-716e3df8ad88" containerName="dnsmasq-dns"
Feb 02 10:52:39 crc kubenswrapper[4845]: E0202 10:52:39.926505 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="250ffbd9-33d6-4a0d-b812-1d092341d4f9" containerName="keystone-bootstrap"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.926511 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="250ffbd9-33d6-4a0d-b812-1d092341d4f9" containerName="keystone-bootstrap"
Feb 02 10:52:39 crc kubenswrapper[4845]: E0202 10:52:39.926530 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dc8c937-0b9b-461a-be1b-02bdb587b70e" containerName="init"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.926536 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dc8c937-0b9b-461a-be1b-02bdb587b70e" containerName="init"
Feb 02 10:52:39 crc kubenswrapper[4845]: E0202 10:52:39.926557 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" containerName="glance-log"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.926564 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" containerName="glance-log"
Feb 02 10:52:39 crc kubenswrapper[4845]: E0202 10:52:39.926577 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6155c0-72ff-4f97-9748-716e3df8ad88" containerName="init"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.926582 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6155c0-72ff-4f97-9748-716e3df8ad88" containerName="init"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.926997 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dc8c937-0b9b-461a-be1b-02bdb587b70e" containerName="init"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.927022 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="250ffbd9-33d6-4a0d-b812-1d092341d4f9" containerName="keystone-bootstrap"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.927036 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" containerName="glance-log"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.927056 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6155c0-72ff-4f97-9748-716e3df8ad88" containerName="dnsmasq-dns"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.927079 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" containerName="glance-httpd"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.929973 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.934653 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.938228 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 02 10:52:39 crc kubenswrapper[4845]: I0202 10:52:39.939219 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.098910 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.099258 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.099400 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.099506 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.099616 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl7xq\" (UniqueName: \"kubernetes.io/projected/7f69079e-af81-421c-870a-2a08c1b2420e-kube-api-access-cl7xq\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.100168 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.100722 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.101166 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.203266 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.203652 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.203679 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.203726 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.203748 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl7xq\" (UniqueName: \"kubernetes.io/projected/7f69079e-af81-421c-870a-2a08c1b2420e-kube-api-access-cl7xq\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.203800 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.203825 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.203847 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.204591 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.205240 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.212647 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.212699 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2a65ad887e305c92ac1c8235cc9c5fc327f1ea7ce91b9974356e11ee00bc2f81/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.216046 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.216137 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.217275 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.217086 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.229900 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl7xq\" (UniqueName: \"kubernetes.io/projected/7f69079e-af81-421c-870a-2a08c1b2420e-kube-api-access-cl7xq\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.278217 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.440118 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-glpvf"]
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.454600 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-glpvf"]
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.562613 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.563188 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-f7js4"]
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.564627 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.570697 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.571017 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.571110 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.571270 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4r6h5"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.572250 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.584590 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f7js4"]
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.717135 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-scripts\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.717227 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-fernet-keys\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.717302 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4cxd\" (UniqueName: \"kubernetes.io/projected/ae3aa591-f1f0-4264-a970-d8172cc24781-kube-api-access-v4cxd\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.717346 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-combined-ca-bundle\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.717415 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-credential-keys\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.717446 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-config-data\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.819645 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-credential-keys\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.819736 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-config-data\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.819909 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-scripts\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.819989 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-fernet-keys\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.820028 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4cxd\" (UniqueName: \"kubernetes.io/projected/ae3aa591-f1f0-4264-a970-d8172cc24781-kube-api-access-v4cxd\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.820090 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-combined-ca-bundle\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.824431 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-scripts\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.824719 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-credential-keys\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.824921 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-fernet-keys\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.824961 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-combined-ca-bundle\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.828868 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-config-data\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.856410 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4cxd\" (UniqueName: \"kubernetes.io/projected/ae3aa591-f1f0-4264-a970-d8172cc24781-kube-api-access-v4cxd\") pod \"keystone-bootstrap-f7js4\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:40 crc kubenswrapper[4845]: I0202 10:52:40.919685 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:52:41 crc kubenswrapper[4845]: I0202 10:52:41.725381 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250ffbd9-33d6-4a0d-b812-1d092341d4f9" path="/var/lib/kubelet/pods/250ffbd9-33d6-4a0d-b812-1d092341d4f9/volumes"
Feb 02 10:52:41 crc kubenswrapper[4845]: I0202 10:52:41.726763 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7543ae2-53fd-42d7-971f-a09923f10187" path="/var/lib/kubelet/pods/f7543ae2-53fd-42d7-971f-a09923f10187/volumes"
Feb 02 10:52:44 crc kubenswrapper[4845]: I0202 10:52:44.762792 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7lh98" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused"
Feb 02 10:52:45 crc kubenswrapper[4845]: E0202 10:52:45.001007 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Feb 02 10:52:45 crc kubenswrapper[4845]: E0202 10:52:45.001430 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjfj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-hft5g_openstack(9868fb5b-b18e-42b0-8532-6e6a55da71d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 02 10:52:45 crc kubenswrapper[4845]: E0202 10:52:45.002594 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-hft5g"
podUID="9868fb5b-b18e-42b0-8532-6e6a55da71d2" Feb 02 10:52:45 crc kubenswrapper[4845]: E0202 10:52:45.656365 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-hft5g" podUID="9868fb5b-b18e-42b0-8532-6e6a55da71d2" Feb 02 10:52:47 crc kubenswrapper[4845]: E0202 10:52:47.088202 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 02 10:52:47 crc kubenswrapper[4845]: E0202 10:52:47.088514 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d4h67bh5b9h5fh688h565hb4h685h5d4h5cbh65fh644h54bh79hbdh59dh74h666h56bh7fh584h688hddh565h646h5fbh5h78h596hf7h656hcdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4g7v5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ba91fb37-4550-4684-99bb-45dba169a879): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.219045 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.304526 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4hxp\" (UniqueName: \"kubernetes.io/projected/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-kube-api-access-d4hxp\") pod \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.305108 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-httpd-run\") pod \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.305206 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-logs\") pod \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.305228 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-combined-ca-bundle\") pod \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.305266 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-scripts\") pod \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.305294 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-config-data\") pod \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.305345 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-public-tls-certs\") pod \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.305458 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\" (UID: \"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a\") " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.306125 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" (UID: "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.306141 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-logs" (OuterVolumeSpecName: "logs") pod "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" (UID: "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.307625 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.307675 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.315290 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-scripts" (OuterVolumeSpecName: "scripts") pod "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" (UID: "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.323081 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-kube-api-access-d4hxp" (OuterVolumeSpecName: "kube-api-access-d4hxp") pod "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" (UID: "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a"). InnerVolumeSpecName "kube-api-access-d4hxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.343504 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7" (OuterVolumeSpecName: "glance") pod "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" (UID: "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a"). InnerVolumeSpecName "pvc-a16f116c-8f63-4ae9-a645-587add90fda7". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.387088 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" (UID: "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.393268 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" (UID: "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.402095 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-config-data" (OuterVolumeSpecName: "config-data") pod "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" (UID: "07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.410278 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") on node \"crc\" " Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.410324 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4hxp\" (UniqueName: \"kubernetes.io/projected/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-kube-api-access-d4hxp\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.410337 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.410347 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.410356 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.410364 4845 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.447623 4845 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.447810 4845 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a16f116c-8f63-4ae9-a645-587add90fda7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7") on node "crc" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.512683 4845 reconciler_common.go:293] "Volume detached for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.678084 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a","Type":"ContainerDied","Data":"3a2962ac3acf70d6e69f1fadfc2545a5d2cf6a481bf3dd186493c288796a95b6"} Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.678136 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.744401 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.744441 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.761336 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:47 crc kubenswrapper[4845]: E0202 10:52:47.761898 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerName="glance-log" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.761911 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerName="glance-log" Feb 02 10:52:47 crc kubenswrapper[4845]: E0202 10:52:47.761963 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerName="glance-httpd" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.761972 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerName="glance-httpd" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.762275 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerName="glance-httpd" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.762306 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" containerName="glance-log" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.769707 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.773202 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.773235 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.782147 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.921356 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.921767 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-scripts\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.921850 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-config-data\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.922076 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.922168 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzv2r\" (UniqueName: \"kubernetes.io/projected/48aa6807-1e0b-4eab-8255-01c885a24550-kube-api-access-nzv2r\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.922418 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.922455 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-logs\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:47 crc kubenswrapper[4845]: I0202 10:52:47.922513 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.024395 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.024550 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.024580 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-scripts\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.024614 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-config-data\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.024721 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.024763 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nzv2r\" (UniqueName: \"kubernetes.io/projected/48aa6807-1e0b-4eab-8255-01c885a24550-kube-api-access-nzv2r\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.024801 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.024831 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-logs\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.025129 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.025187 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-logs\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.027570 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.027606 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ed60d0e4f5ad2fb51e67cadb4519054184ad51c31b402d173121c9411d32387/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.029917 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-scripts\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.030256 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.031724 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.036823 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.044241 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzv2r\" (UniqueName: \"kubernetes.io/projected/48aa6807-1e0b-4eab-8255-01c885a24550-kube-api-access-nzv2r\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.079676 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:48 crc kubenswrapper[4845]: I0202 10:52:48.094987 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:49 crc kubenswrapper[4845]: I0202 10:52:49.728131 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a" path="/var/lib/kubelet/pods/07c692d7-14b8-42ef-8f5b-9c88fbfd5d6a/volumes" Feb 02 10:52:51 crc kubenswrapper[4845]: I0202 10:52:51.627242 4845 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode0a38136-159d-482a-988e-07f3b77fdbb4"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode0a38136-159d-482a-988e-07f3b77fdbb4] : Timed out while waiting for systemd to remove kubepods-besteffort-pode0a38136_159d_482a_988e_07f3b77fdbb4.slice" Feb 02 10:52:51 crc kubenswrapper[4845]: E0202 10:52:51.627776 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pode0a38136-159d-482a-988e-07f3b77fdbb4] : unable to destroy cgroup paths for cgroup [kubepods besteffort pode0a38136-159d-482a-988e-07f3b77fdbb4] : Timed out while waiting for systemd to remove kubepods-besteffort-pode0a38136_159d_482a_988e_07f3b77fdbb4.slice" pod="openstack/dnsmasq-dns-5b946c75cc-g827z" podUID="e0a38136-159d-482a-988e-07f3b77fdbb4" Feb 02 10:52:51 crc kubenswrapper[4845]: I0202 10:52:51.723732 4845 generic.go:334] "Generic (PLEG): container finished" podID="856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2" containerID="eaea8c79134a763eb67941fd983f9ce0e269d99bbc22b44421da606e30805a94" exitCode=0 Feb 02 10:52:51 crc kubenswrapper[4845]: I0202 10:52:51.723859 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-g827z" Feb 02 10:52:51 crc kubenswrapper[4845]: I0202 10:52:51.724647 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cpdt4" event={"ID":"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2","Type":"ContainerDied","Data":"eaea8c79134a763eb67941fd983f9ce0e269d99bbc22b44421da606e30805a94"} Feb 02 10:52:51 crc kubenswrapper[4845]: I0202 10:52:51.851078 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-g827z"] Feb 02 10:52:51 crc kubenswrapper[4845]: I0202 10:52:51.873315 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-g827z"] Feb 02 10:52:53 crc kubenswrapper[4845]: I0202 10:52:53.727477 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a38136-159d-482a-988e-07f3b77fdbb4" path="/var/lib/kubelet/pods/e0a38136-159d-482a-988e-07f3b77fdbb4/volumes" Feb 02 10:52:54 crc kubenswrapper[4845]: I0202 10:52:54.763065 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7lh98" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: i/o timeout" Feb 02 10:52:54 crc kubenswrapper[4845]: I0202 10:52:54.763550 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.265784 4845 scope.go:117] "RemoveContainer" containerID="5afe1239bc585e0bef47e3d113767ef48eca1fb7a34ae2876787a3e542d61760" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.425636 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.434099 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.542280 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-combined-ca-bundle\") pod \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.542353 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-sb\") pod \"125bfda8-e971-4249-8b07-0bbff61e4725\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.542589 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-config\") pod \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.542633 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zllkd\" (UniqueName: \"kubernetes.io/projected/125bfda8-e971-4249-8b07-0bbff61e4725-kube-api-access-zllkd\") pod \"125bfda8-e971-4249-8b07-0bbff61e4725\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.542729 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mpk6\" (UniqueName: \"kubernetes.io/projected/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-kube-api-access-9mpk6\") pod \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\" (UID: \"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2\") " Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.542820 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-nb\") pod \"125bfda8-e971-4249-8b07-0bbff61e4725\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.542872 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-config\") pod \"125bfda8-e971-4249-8b07-0bbff61e4725\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.542917 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-dns-svc\") pod \"125bfda8-e971-4249-8b07-0bbff61e4725\" (UID: \"125bfda8-e971-4249-8b07-0bbff61e4725\") " Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.548801 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-kube-api-access-9mpk6" (OuterVolumeSpecName: "kube-api-access-9mpk6") pod "856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2" (UID: "856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2"). InnerVolumeSpecName "kube-api-access-9mpk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.549365 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125bfda8-e971-4249-8b07-0bbff61e4725-kube-api-access-zllkd" (OuterVolumeSpecName: "kube-api-access-zllkd") pod "125bfda8-e971-4249-8b07-0bbff61e4725" (UID: "125bfda8-e971-4249-8b07-0bbff61e4725"). InnerVolumeSpecName "kube-api-access-zllkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.585914 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-config" (OuterVolumeSpecName: "config") pod "856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2" (UID: "856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.591335 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2" (UID: "856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.608659 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-config" (OuterVolumeSpecName: "config") pod "125bfda8-e971-4249-8b07-0bbff61e4725" (UID: "125bfda8-e971-4249-8b07-0bbff61e4725"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.611266 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "125bfda8-e971-4249-8b07-0bbff61e4725" (UID: "125bfda8-e971-4249-8b07-0bbff61e4725"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.618063 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "125bfda8-e971-4249-8b07-0bbff61e4725" (UID: "125bfda8-e971-4249-8b07-0bbff61e4725"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.628073 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "125bfda8-e971-4249-8b07-0bbff61e4725" (UID: "125bfda8-e971-4249-8b07-0bbff61e4725"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.646568 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mpk6\" (UniqueName: \"kubernetes.io/projected/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-kube-api-access-9mpk6\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.646602 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.646614 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.646623 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 
02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.646634 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.646643 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/125bfda8-e971-4249-8b07-0bbff61e4725-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.646652 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.646663 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zllkd\" (UniqueName: \"kubernetes.io/projected/125bfda8-e971-4249-8b07-0bbff61e4725-kube-api-access-zllkd\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.773684 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-cpdt4" event={"ID":"856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2","Type":"ContainerDied","Data":"910eea5be92e0d78d64ac4574401cd26e5b94f8c77cd06cf2a9e9a5f781e5430"} Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.773725 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="910eea5be92e0d78d64ac4574401cd26e5b94f8c77cd06cf2a9e9a5f781e5430" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.773777 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-cpdt4" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.781873 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-7lh98" event={"ID":"125bfda8-e971-4249-8b07-0bbff61e4725","Type":"ContainerDied","Data":"d8710db5f1971bcb1ada6e2682b3528a8c529ad636b2e603fac42dddaaffa6b0"} Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.781946 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-7lh98" Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.814611 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7lh98"] Feb 02 10:52:55 crc kubenswrapper[4845]: I0202 10:52:55.833244 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-7lh98"] Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.733742 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mrvw8"] Feb 02 10:52:56 crc kubenswrapper[4845]: E0202 10:52:56.734304 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="dnsmasq-dns" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.734328 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="dnsmasq-dns" Feb 02 10:52:56 crc kubenswrapper[4845]: E0202 10:52:56.734359 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="init" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.734369 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="init" Feb 02 10:52:56 crc kubenswrapper[4845]: E0202 10:52:56.734391 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2" 
containerName="neutron-db-sync" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.734399 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2" containerName="neutron-db-sync" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.734641 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="dnsmasq-dns" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.734688 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2" containerName="neutron-db-sync" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.737122 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.762354 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mrvw8"] Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.880866 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.880955 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.880977 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.881018 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tnwb\" (UniqueName: \"kubernetes.io/projected/33b0a7fa-f66e-470e-95a3-a110ecec168b-kube-api-access-2tnwb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.881047 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-config\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.881166 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.938701 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c658d9d4-mvn9b"] Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.941985 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.944843 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.945270 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.945461 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tjnkn" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.945776 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.956576 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c658d9d4-mvn9b"] Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.983546 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.983614 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.983634 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: 
\"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.983679 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tnwb\" (UniqueName: \"kubernetes.io/projected/33b0a7fa-f66e-470e-95a3-a110ecec168b-kube-api-access-2tnwb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.983756 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-config\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.983860 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.984738 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-svc\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.985239 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 
10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.985761 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.986311 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-config\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:56 crc kubenswrapper[4845]: I0202 10:52:56.986382 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.017346 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tnwb\" (UniqueName: \"kubernetes.io/projected/33b0a7fa-f66e-470e-95a3-a110ecec168b-kube-api-access-2tnwb\") pod \"dnsmasq-dns-55f844cf75-mrvw8\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.078756 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.086123 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9m4\" (UniqueName: \"kubernetes.io/projected/381d0503-4113-48e1-a344-88e990400075-kube-api-access-sc9m4\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.086305 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-combined-ca-bundle\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.086355 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-httpd-config\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.086458 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-config\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.086496 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-ovndb-tls-certs\") pod \"neutron-7c658d9d4-mvn9b\" (UID: 
\"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.188534 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9m4\" (UniqueName: \"kubernetes.io/projected/381d0503-4113-48e1-a344-88e990400075-kube-api-access-sc9m4\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.189068 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-combined-ca-bundle\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.189094 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-httpd-config\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.189212 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-config\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.189287 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-ovndb-tls-certs\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc 
kubenswrapper[4845]: I0202 10:52:57.193930 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-combined-ca-bundle\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.204655 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-ovndb-tls-certs\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.206575 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc9m4\" (UniqueName: \"kubernetes.io/projected/381d0503-4113-48e1-a344-88e990400075-kube-api-access-sc9m4\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.206618 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-httpd-config\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.221493 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-config\") pod \"neutron-7c658d9d4-mvn9b\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.261945 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.735004 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" path="/var/lib/kubelet/pods/125bfda8-e971-4249-8b07-0bbff61e4725/volumes" Feb 02 10:52:57 crc kubenswrapper[4845]: E0202 10:52:57.765031 4845 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 02 10:52:57 crc kubenswrapper[4845]: E0202 10:52:57.765226 4845 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPa
th:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rflw4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-g8b4r_openstack(183b0ef9-490f-43a1-a464-2bd64a820ebd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:52:57 crc kubenswrapper[4845]: E0202 10:52:57.767736 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-g8b4r" podUID="183b0ef9-490f-43a1-a464-2bd64a820ebd" Feb 02 10:52:57 crc kubenswrapper[4845]: I0202 10:52:57.778012 4845 scope.go:117] "RemoveContainer" containerID="27b4758631a6299ed33bc9ad77808ae89e6b4adea1f9c02dc02478e0b78913fa" Feb 02 10:52:57 crc kubenswrapper[4845]: E0202 10:52:57.834095 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-g8b4r" podUID="183b0ef9-490f-43a1-a464-2bd64a820ebd" Feb 02 10:52:58 crc kubenswrapper[4845]: I0202 10:52:58.314382 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:58 crc kubenswrapper[4845]: W0202 10:52:58.441817 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f69079e_af81_421c_870a_2a08c1b2420e.slice/crio-4d7f9cbfa39969e34e788860572765e355857e65be648d766beb851dfaf208a7 WatchSource:0}: Error finding container 4d7f9cbfa39969e34e788860572765e355857e65be648d766beb851dfaf208a7: Status 404 returned error can't find the container with id 4d7f9cbfa39969e34e788860572765e355857e65be648d766beb851dfaf208a7 Feb 02 10:52:58 crc kubenswrapper[4845]: I0202 10:52:58.482526 4845 scope.go:117] "RemoveContainer" containerID="754cb2f68170ee366688fac6af09583225928231cbd2adb4266116caffa237e6" Feb 02 10:52:58 crc kubenswrapper[4845]: I0202 10:52:58.711960 4845 scope.go:117] "RemoveContainer" containerID="7b934d9dc539ae78cc75a18c8a04ead9cef0ee49e2411933655a99884f524af5" Feb 02 10:52:58 crc kubenswrapper[4845]: I0202 10:52:58.870898 4845 scope.go:117] "RemoveContainer" containerID="ddd6909bbdf16b8af6fed8c71335e6bc1892bf152856011ceb433a2a497011e2" Feb 02 10:52:58 crc kubenswrapper[4845]: I0202 10:52:58.888502 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f69079e-af81-421c-870a-2a08c1b2420e","Type":"ContainerStarted","Data":"4d7f9cbfa39969e34e788860572765e355857e65be648d766beb851dfaf208a7"} Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.004824 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6898599c95-65qmn"] Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.006774 4845 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.012739 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.013113 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.039170 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6898599c95-65qmn"] Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.146052 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-combined-ca-bundle\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.146446 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-internal-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.146491 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-config\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.146542 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-public-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.146587 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-httpd-config\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.146629 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-ovndb-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.146704 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9ml\" (UniqueName: \"kubernetes.io/projected/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-kube-api-access-mk9ml\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.236380 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.295389 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-combined-ca-bundle\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc 
kubenswrapper[4845]: I0202 10:52:59.295502 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-internal-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.295545 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-config\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.295611 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-public-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.295672 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-httpd-config\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.295725 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-ovndb-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.295833 4845 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mk9ml\" (UniqueName: \"kubernetes.io/projected/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-kube-api-access-mk9ml\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.314363 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-combined-ca-bundle\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.324449 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-ovndb-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.328952 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-httpd-config\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.329032 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-f7js4"] Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.329344 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-internal-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.338154 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9ml\" (UniqueName: \"kubernetes.io/projected/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-kube-api-access-mk9ml\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.338967 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-config\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.343646 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-public-tls-certs\") pod \"neutron-6898599c95-65qmn\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.353238 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mrvw8"] Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.425394 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.569548 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c658d9d4-mvn9b"] Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.764040 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-7lh98" podUID="125bfda8-e971-4249-8b07-0bbff61e4725" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: i/o timeout" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.916745 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f7js4" event={"ID":"ae3aa591-f1f0-4264-a970-d8172cc24781","Type":"ContainerStarted","Data":"055042cb2d15b6eebd454cdc9f356a4e981de86dda6b046eef4e17f7f79f827f"} Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.917111 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f7js4" event={"ID":"ae3aa591-f1f0-4264-a970-d8172cc24781","Type":"ContainerStarted","Data":"a92252ab7199355c9918d53c338f871f76fb1b5cb5bdfbecf7537a0811f888ee"} Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.926505 4845 generic.go:334] "Generic (PLEG): container finished" podID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerID="720ce281ce34166ec16d662e6b2cfb3e73984fcbc45102640b4bba9183262c78" exitCode=0 Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.926594 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" event={"ID":"33b0a7fa-f66e-470e-95a3-a110ecec168b","Type":"ContainerDied","Data":"720ce281ce34166ec16d662e6b2cfb3e73984fcbc45102640b4bba9183262c78"} Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.926624 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" 
event={"ID":"33b0a7fa-f66e-470e-95a3-a110ecec168b","Type":"ContainerStarted","Data":"24bfd21204b757a912a5fde09fd51c09b7d4dece5bb10e7fce06a7581904b6df"} Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.942560 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-f7js4" podStartSLOduration=19.942537954 podStartE2EDuration="19.942537954s" podCreationTimestamp="2026-02-02 10:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:59.940356632 +0000 UTC m=+1261.031758082" watchObservedRunningTime="2026-02-02 10:52:59.942537954 +0000 UTC m=+1261.033939404" Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.947353 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f69079e-af81-421c-870a-2a08c1b2420e","Type":"ContainerStarted","Data":"4f20fb73216b64db7b3a8f01b837e3181d5ffdb5d4d8bf409ca31ea2e79b0bcd"} Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.951911 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba91fb37-4550-4684-99bb-45dba169a879","Type":"ContainerStarted","Data":"4c795b185512cbaa08a41087a290ab376088c31c6277e1a8f0ee3f21dc22200f"} Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.959050 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxrm5" event={"ID":"250e18d9-cb14-4309-8d0c-fb341511dba6","Type":"ContainerStarted","Data":"bf4552b15f381b58f4ac832ee06ab76b9eb11fbeaace10aae25c2f5b9bfbac69"} Feb 02 10:52:59 crc kubenswrapper[4845]: I0202 10:52:59.978093 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48aa6807-1e0b-4eab-8255-01c885a24550","Type":"ContainerStarted","Data":"046426bd44987200c7af4b3e2c8d25a4f244d0fd82b246a7123cbf79584728b3"} Feb 02 10:52:59 crc kubenswrapper[4845]: 
I0202 10:52:59.993578 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-kxrm5" podStartSLOduration=4.531212257 podStartE2EDuration="34.993557811s" podCreationTimestamp="2026-02-02 10:52:25 +0000 UTC" firstStartedPulling="2026-02-02 10:52:27.323146892 +0000 UTC m=+1228.414548342" lastFinishedPulling="2026-02-02 10:52:57.785492446 +0000 UTC m=+1258.876893896" observedRunningTime="2026-02-02 10:52:59.981316171 +0000 UTC m=+1261.072717621" watchObservedRunningTime="2026-02-02 10:52:59.993557811 +0000 UTC m=+1261.084959261" Feb 02 10:53:00 crc kubenswrapper[4845]: I0202 10:53:00.029247 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c658d9d4-mvn9b" event={"ID":"381d0503-4113-48e1-a344-88e990400075","Type":"ContainerStarted","Data":"e0403804c19c7d887fabdb04dd94aa27e4ca3026135843fda8f93cf69fb0c8cd"} Feb 02 10:53:00 crc kubenswrapper[4845]: I0202 10:53:00.070711 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gxjbc" event={"ID":"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b","Type":"ContainerStarted","Data":"e50f62cc9707edb269ffe6698207592dfbc48d0ade6ff635de9614b2c3d62a34"} Feb 02 10:53:00 crc kubenswrapper[4845]: I0202 10:53:00.089614 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-gxjbc" podStartSLOduration=5.458127102 podStartE2EDuration="35.089595833s" podCreationTimestamp="2026-02-02 10:52:25 +0000 UTC" firstStartedPulling="2026-02-02 10:52:28.078615502 +0000 UTC m=+1229.170016952" lastFinishedPulling="2026-02-02 10:52:57.710084233 +0000 UTC m=+1258.801485683" observedRunningTime="2026-02-02 10:53:00.088463851 +0000 UTC m=+1261.179865311" watchObservedRunningTime="2026-02-02 10:53:00.089595833 +0000 UTC m=+1261.180997273" Feb 02 10:53:00 crc kubenswrapper[4845]: I0202 10:53:00.117620 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6898599c95-65qmn"] Feb 02 10:53:00 crc 
kubenswrapper[4845]: W0202 10:53:00.149606 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d345372_d7c4_4094_b9cb_e2afbd2dbf54.slice/crio-00ea85117a3a7a613efb0ee0b197731f8faf53d379b635789e63fb18dc175257 WatchSource:0}: Error finding container 00ea85117a3a7a613efb0ee0b197731f8faf53d379b635789e63fb18dc175257: Status 404 returned error can't find the container with id 00ea85117a3a7a613efb0ee0b197731f8faf53d379b635789e63fb18dc175257 Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.086878 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6898599c95-65qmn" event={"ID":"0d345372-d7c4-4094-b9cb-e2afbd2dbf54","Type":"ContainerStarted","Data":"9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d"} Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.088869 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.089128 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6898599c95-65qmn" event={"ID":"0d345372-d7c4-4094-b9cb-e2afbd2dbf54","Type":"ContainerStarted","Data":"c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618"} Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.089151 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6898599c95-65qmn" event={"ID":"0d345372-d7c4-4094-b9cb-e2afbd2dbf54","Type":"ContainerStarted","Data":"00ea85117a3a7a613efb0ee0b197731f8faf53d379b635789e63fb18dc175257"} Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.092944 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48aa6807-1e0b-4eab-8255-01c885a24550","Type":"ContainerStarted","Data":"1e24bbe2d8cd0583fc986cf6fd412f527daea58b4a85706eb389322bf6ad3af7"} Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 
10:53:01.109423 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c658d9d4-mvn9b" event={"ID":"381d0503-4113-48e1-a344-88e990400075","Type":"ContainerStarted","Data":"0318987d82017a4372eee65da6e584e622e1a0531f87350923bd3ea8ad37c0e3"} Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.109480 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c658d9d4-mvn9b" event={"ID":"381d0503-4113-48e1-a344-88e990400075","Type":"ContainerStarted","Data":"2ea8dbd44d235dcde26b7387e5ca4d94d64fb20bb5d89d73c2a48c90da6ef1d6"} Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.110728 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.118836 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6898599c95-65qmn" podStartSLOduration=3.11881324 podStartE2EDuration="3.11881324s" podCreationTimestamp="2026-02-02 10:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:01.105203381 +0000 UTC m=+1262.196604831" watchObservedRunningTime="2026-02-02 10:53:01.11881324 +0000 UTC m=+1262.210214690" Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.128994 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" event={"ID":"33b0a7fa-f66e-470e-95a3-a110ecec168b","Type":"ContainerStarted","Data":"b19033b665f9424ff6ceb80860a3f821b9318c6ff803e1c8ca9ff4c4824c2a24"} Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.129059 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.148248 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"7f69079e-af81-421c-870a-2a08c1b2420e","Type":"ContainerStarted","Data":"e01c6288e6f46f1f5e6d28589b3c379a1a0749fccb437477b6b9e2e2597123a4"} Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.157147 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c658d9d4-mvn9b" podStartSLOduration=5.157123774 podStartE2EDuration="5.157123774s" podCreationTimestamp="2026-02-02 10:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:01.134456557 +0000 UTC m=+1262.225858007" watchObservedRunningTime="2026-02-02 10:53:01.157123774 +0000 UTC m=+1262.248525224" Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.164767 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" podStartSLOduration=5.164743451 podStartE2EDuration="5.164743451s" podCreationTimestamp="2026-02-02 10:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:01.159424639 +0000 UTC m=+1262.250826089" watchObservedRunningTime="2026-02-02 10:53:01.164743451 +0000 UTC m=+1262.256144901" Feb 02 10:53:01 crc kubenswrapper[4845]: I0202 10:53:01.188348 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=22.188326525 podStartE2EDuration="22.188326525s" podCreationTimestamp="2026-02-02 10:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:01.185094972 +0000 UTC m=+1262.276496422" watchObservedRunningTime="2026-02-02 10:53:01.188326525 +0000 UTC m=+1262.279727975" Feb 02 10:53:02 crc kubenswrapper[4845]: I0202 10:53:02.164483 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"48aa6807-1e0b-4eab-8255-01c885a24550","Type":"ContainerStarted","Data":"5c7e1e0f5ba6836be4b2cc0a23514474d1930ec71f3bf7b3e6b27bbccac7ee40"} Feb 02 10:53:02 crc kubenswrapper[4845]: I0202 10:53:02.199682 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.199660561 podStartE2EDuration="15.199660561s" podCreationTimestamp="2026-02-02 10:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:02.187098232 +0000 UTC m=+1263.278499692" watchObservedRunningTime="2026-02-02 10:53:02.199660561 +0000 UTC m=+1263.291062011" Feb 02 10:53:04 crc kubenswrapper[4845]: I0202 10:53:04.191622 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hft5g" event={"ID":"9868fb5b-b18e-42b0-8532-6e6a55da71d2","Type":"ContainerStarted","Data":"1ae234554f1bf061b9f12986d072427a2f50a26aa3b60b9c9cf12a5ccc0e8cce"} Feb 02 10:53:04 crc kubenswrapper[4845]: I0202 10:53:04.228082 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-hft5g" podStartSLOduration=6.099840694 podStartE2EDuration="39.228061575s" podCreationTimestamp="2026-02-02 10:52:25 +0000 UTC" firstStartedPulling="2026-02-02 10:52:28.502477844 +0000 UTC m=+1229.593879294" lastFinishedPulling="2026-02-02 10:53:01.630698725 +0000 UTC m=+1262.722100175" observedRunningTime="2026-02-02 10:53:04.21560483 +0000 UTC m=+1265.307006280" watchObservedRunningTime="2026-02-02 10:53:04.228061575 +0000 UTC m=+1265.319463025" Feb 02 10:53:05 crc kubenswrapper[4845]: I0202 10:53:05.207722 4845 generic.go:334] "Generic (PLEG): container finished" podID="ae3aa591-f1f0-4264-a970-d8172cc24781" containerID="055042cb2d15b6eebd454cdc9f356a4e981de86dda6b046eef4e17f7f79f827f" exitCode=0 Feb 02 10:53:05 crc kubenswrapper[4845]: 
I0202 10:53:05.207785 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f7js4" event={"ID":"ae3aa591-f1f0-4264-a970-d8172cc24781","Type":"ContainerDied","Data":"055042cb2d15b6eebd454cdc9f356a4e981de86dda6b046eef4e17f7f79f827f"} Feb 02 10:53:05 crc kubenswrapper[4845]: I0202 10:53:05.211548 4845 generic.go:334] "Generic (PLEG): container finished" podID="1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" containerID="e50f62cc9707edb269ffe6698207592dfbc48d0ade6ff635de9614b2c3d62a34" exitCode=0 Feb 02 10:53:05 crc kubenswrapper[4845]: I0202 10:53:05.211594 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gxjbc" event={"ID":"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b","Type":"ContainerDied","Data":"e50f62cc9707edb269ffe6698207592dfbc48d0ade6ff635de9614b2c3d62a34"} Feb 02 10:53:06 crc kubenswrapper[4845]: I0202 10:53:06.240656 4845 generic.go:334] "Generic (PLEG): container finished" podID="250e18d9-cb14-4309-8d0c-fb341511dba6" containerID="bf4552b15f381b58f4ac832ee06ab76b9eb11fbeaace10aae25c2f5b9bfbac69" exitCode=0 Feb 02 10:53:06 crc kubenswrapper[4845]: I0202 10:53:06.240751 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxrm5" event={"ID":"250e18d9-cb14-4309-8d0c-fb341511dba6","Type":"ContainerDied","Data":"bf4552b15f381b58f4ac832ee06ab76b9eb11fbeaace10aae25c2f5b9bfbac69"} Feb 02 10:53:06 crc kubenswrapper[4845]: I0202 10:53:06.966374 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f7js4" Feb 02 10:53:06 crc kubenswrapper[4845]: I0202 10:53:06.995994 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gxjbc" Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.037230 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn4fj\" (UniqueName: \"kubernetes.io/projected/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-kube-api-access-vn4fj\") pod \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.037599 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-combined-ca-bundle\") pod \"ae3aa591-f1f0-4264-a970-d8172cc24781\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.037773 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-logs\") pod \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.038120 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-scripts\") pod \"ae3aa591-f1f0-4264-a970-d8172cc24781\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.038357 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4cxd\" (UniqueName: \"kubernetes.io/projected/ae3aa591-f1f0-4264-a970-d8172cc24781-kube-api-access-v4cxd\") pod \"ae3aa591-f1f0-4264-a970-d8172cc24781\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.038516 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-credential-keys\") pod \"ae3aa591-f1f0-4264-a970-d8172cc24781\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.038704 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-combined-ca-bundle\") pod \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.038872 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-config-data\") pod \"ae3aa591-f1f0-4264-a970-d8172cc24781\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.039281 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-fernet-keys\") pod \"ae3aa591-f1f0-4264-a970-d8172cc24781\" (UID: \"ae3aa591-f1f0-4264-a970-d8172cc24781\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.039616 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-scripts\") pod \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.041520 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-config-data\") pod \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\" (UID: \"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b\") " Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.041037 4845 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-logs" (OuterVolumeSpecName: "logs") pod "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" (UID: "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.049642 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ae3aa591-f1f0-4264-a970-d8172cc24781" (UID: "ae3aa591-f1f0-4264-a970-d8172cc24781"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.055463 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-scripts" (OuterVolumeSpecName: "scripts") pod "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" (UID: "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.055854 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-scripts" (OuterVolumeSpecName: "scripts") pod "ae3aa591-f1f0-4264-a970-d8172cc24781" (UID: "ae3aa591-f1f0-4264-a970-d8172cc24781"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.055938 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ae3aa591-f1f0-4264-a970-d8172cc24781" (UID: "ae3aa591-f1f0-4264-a970-d8172cc24781"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.067437 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae3aa591-f1f0-4264-a970-d8172cc24781-kube-api-access-v4cxd" (OuterVolumeSpecName: "kube-api-access-v4cxd") pod "ae3aa591-f1f0-4264-a970-d8172cc24781" (UID: "ae3aa591-f1f0-4264-a970-d8172cc24781"). InnerVolumeSpecName "kube-api-access-v4cxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.068120 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-kube-api-access-vn4fj" (OuterVolumeSpecName: "kube-api-access-vn4fj") pod "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" (UID: "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b"). InnerVolumeSpecName "kube-api-access-vn4fj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.083442 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.107741 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae3aa591-f1f0-4264-a970-d8172cc24781" (UID: "ae3aa591-f1f0-4264-a970-d8172cc24781"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.143767 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" (UID: "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.151932 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn4fj\" (UniqueName: \"kubernetes.io/projected/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-kube-api-access-vn4fj\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.152282 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.152588 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.152601 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.152612 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4cxd\" (UniqueName: \"kubernetes.io/projected/ae3aa591-f1f0-4264-a970-d8172cc24781-kube-api-access-v4cxd\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.152620 4845 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.152629 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.152637 4845 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.152646 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.186518 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-sbm2k"]
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.186834 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" podUID="08802fb3-9897-4819-a38b-fe13e8892b47" containerName="dnsmasq-dns" containerID="cri-o://3ed1c7658c68f0c8c4a3c24a82346d038eae78839b07b31f8eefe48619b71f5d" gracePeriod=10
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.205488 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-config-data" (OuterVolumeSpecName: "config-data") pod "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" (UID: "1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.208543 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-config-data" (OuterVolumeSpecName: "config-data") pod "ae3aa591-f1f0-4264-a970-d8172cc24781" (UID: "ae3aa591-f1f0-4264-a970-d8172cc24781"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.254438 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.254470 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae3aa591-f1f0-4264-a970-d8172cc24781-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.256866 4845 generic.go:334] "Generic (PLEG): container finished" podID="9868fb5b-b18e-42b0-8532-6e6a55da71d2" containerID="1ae234554f1bf061b9f12986d072427a2f50a26aa3b60b9c9cf12a5ccc0e8cce" exitCode=0
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.257102 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hft5g" event={"ID":"9868fb5b-b18e-42b0-8532-6e6a55da71d2","Type":"ContainerDied","Data":"1ae234554f1bf061b9f12986d072427a2f50a26aa3b60b9c9cf12a5ccc0e8cce"}
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.262391 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-f7js4"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.262400 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-f7js4" event={"ID":"ae3aa591-f1f0-4264-a970-d8172cc24781","Type":"ContainerDied","Data":"a92252ab7199355c9918d53c338f871f76fb1b5cb5bdfbecf7537a0811f888ee"}
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.262735 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a92252ab7199355c9918d53c338f871f76fb1b5cb5bdfbecf7537a0811f888ee"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.280636 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gxjbc"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.282002 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gxjbc" event={"ID":"1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b","Type":"ContainerDied","Data":"f5152a46482c33a388878313470bd17796ec05812413482c82f7e14ccb92881e"}
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.282038 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5152a46482c33a388878313470bd17796ec05812413482c82f7e14ccb92881e"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.409255 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c4f9db54b-5v9r8"]
Feb 02 10:53:07 crc kubenswrapper[4845]: E0202 10:53:07.410095 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae3aa591-f1f0-4264-a970-d8172cc24781" containerName="keystone-bootstrap"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.410117 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae3aa591-f1f0-4264-a970-d8172cc24781" containerName="keystone-bootstrap"
Feb 02 10:53:07 crc kubenswrapper[4845]: E0202 10:53:07.410147 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" containerName="placement-db-sync"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.410154 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" containerName="placement-db-sync"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.410344 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" containerName="placement-db-sync"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.410366 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae3aa591-f1f0-4264-a970-d8172cc24781" containerName="keystone-bootstrap"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.411211 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.419947 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.420013 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.420367 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4r6h5"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.420506 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.420576 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.420704 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.427566 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c4f9db54b-5v9r8"]
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460070 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-internal-tls-certs\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460124 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kb9b\" (UniqueName: \"kubernetes.io/projected/61e42051-311d-4b4b-af17-e301351d9267-kube-api-access-2kb9b\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460190 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-scripts\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460191 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7fd6897c68-cspbg"]
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460359 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-credential-keys\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460393 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-fernet-keys\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460471 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-config-data\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460745 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-public-tls-certs\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.460943 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-combined-ca-bundle\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.462660 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.474270 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.474482 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.474596 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.474708 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.475436 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kzmx2"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.550678 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fd6897c68-cspbg"]
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.581965 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-config-data\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582067 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-scripts\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582097 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-config-data\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582219 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-public-tls-certs\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582292 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll5b8\" (UniqueName: \"kubernetes.io/projected/3231a338-4ba7-4851-9fd5-a7ba84f13089-kube-api-access-ll5b8\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582470 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-combined-ca-bundle\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582511 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3231a338-4ba7-4851-9fd5-a7ba84f13089-logs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582647 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-internal-tls-certs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582847 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-internal-tls-certs\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582899 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kb9b\" (UniqueName: \"kubernetes.io/projected/61e42051-311d-4b4b-af17-e301351d9267-kube-api-access-2kb9b\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.582934 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-combined-ca-bundle\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.583014 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-scripts\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.583052 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-public-tls-certs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.583103 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-credential-keys\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.583134 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-fernet-keys\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.594157 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-public-tls-certs\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.596590 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-combined-ca-bundle\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.597151 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-internal-tls-certs\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.599077 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-scripts\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.599600 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-fernet-keys\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.600226 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-config-data\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.603164 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/61e42051-311d-4b4b-af17-e301351d9267-credential-keys\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.612424 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kb9b\" (UniqueName: \"kubernetes.io/projected/61e42051-311d-4b4b-af17-e301351d9267-kube-api-access-2kb9b\") pod \"keystone-c4f9db54b-5v9r8\" (UID: \"61e42051-311d-4b4b-af17-e301351d9267\") " pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.687735 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-combined-ca-bundle\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.687843 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-public-tls-certs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.688450 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-scripts\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.688481 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-config-data\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.688598 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll5b8\" (UniqueName: \"kubernetes.io/projected/3231a338-4ba7-4851-9fd5-a7ba84f13089-kube-api-access-ll5b8\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.689037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3231a338-4ba7-4851-9fd5-a7ba84f13089-logs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.689848 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-internal-tls-certs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.695336 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3231a338-4ba7-4851-9fd5-a7ba84f13089-logs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.700400 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-scripts\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.704401 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-public-tls-certs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.706894 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-combined-ca-bundle\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.714542 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-config-data\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.717435 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-internal-tls-certs\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.730058 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll5b8\" (UniqueName: \"kubernetes.io/projected/3231a338-4ba7-4851-9fd5-a7ba84f13089-kube-api-access-ll5b8\") pod \"placement-7fd6897c68-cspbg\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") " pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.781505 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.808952 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68f64c64d8-r7nkx"]
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.811569 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.811728 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.823250 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68f64c64d8-r7nkx"]
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.903379 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-scripts\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.903434 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhrlx\" (UniqueName: \"kubernetes.io/projected/5978920a-e63d-4cb3-accd-4353fb398d50-kube-api-access-zhrlx\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.903485 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-internal-tls-certs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.903519 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-public-tls-certs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.903578 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-config-data\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.903632 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5978920a-e63d-4cb3-accd-4353fb398d50-logs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:07 crc kubenswrapper[4845]: I0202 10:53:07.903671 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-combined-ca-bundle\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.005281 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5978920a-e63d-4cb3-accd-4353fb398d50-logs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.005343 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-combined-ca-bundle\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.005388 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-scripts\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.005413 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhrlx\" (UniqueName: \"kubernetes.io/projected/5978920a-e63d-4cb3-accd-4353fb398d50-kube-api-access-zhrlx\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.005455 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-internal-tls-certs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.005489 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-public-tls-certs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.005550 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-config-data\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.006229 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5978920a-e63d-4cb3-accd-4353fb398d50-logs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.010079 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-combined-ca-bundle\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.011225 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-public-tls-certs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.014576 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-internal-tls-certs\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.017701 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-scripts\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.020923 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5978920a-e63d-4cb3-accd-4353fb398d50-config-data\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.022192 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhrlx\" (UniqueName: \"kubernetes.io/projected/5978920a-e63d-4cb3-accd-4353fb398d50-kube-api-access-zhrlx\") pod \"placement-68f64c64d8-r7nkx\" (UID: \"5978920a-e63d-4cb3-accd-4353fb398d50\") " pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.031741 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68f64c64d8-r7nkx"
Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.035323 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/heat-db-sync-kxrm5" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.095636 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.095914 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.107337 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-config-data\") pod \"250e18d9-cb14-4309-8d0c-fb341511dba6\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.107557 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-combined-ca-bundle\") pod \"250e18d9-cb14-4309-8d0c-fb341511dba6\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.108043 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjs7z\" (UniqueName: \"kubernetes.io/projected/250e18d9-cb14-4309-8d0c-fb341511dba6-kube-api-access-qjs7z\") pod \"250e18d9-cb14-4309-8d0c-fb341511dba6\" (UID: \"250e18d9-cb14-4309-8d0c-fb341511dba6\") " Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.175682 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/250e18d9-cb14-4309-8d0c-fb341511dba6-kube-api-access-qjs7z" (OuterVolumeSpecName: "kube-api-access-qjs7z") pod "250e18d9-cb14-4309-8d0c-fb341511dba6" (UID: "250e18d9-cb14-4309-8d0c-fb341511dba6"). InnerVolumeSpecName "kube-api-access-qjs7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.191037 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "250e18d9-cb14-4309-8d0c-fb341511dba6" (UID: "250e18d9-cb14-4309-8d0c-fb341511dba6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.211754 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.211790 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjs7z\" (UniqueName: \"kubernetes.io/projected/250e18d9-cb14-4309-8d0c-fb341511dba6-kube-api-access-qjs7z\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.228948 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.242540 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.283072 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-config-data" (OuterVolumeSpecName: "config-data") pod "250e18d9-cb14-4309-8d0c-fb341511dba6" (UID: "250e18d9-cb14-4309-8d0c-fb341511dba6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.316465 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/250e18d9-cb14-4309-8d0c-fb341511dba6-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.323747 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba91fb37-4550-4684-99bb-45dba169a879","Type":"ContainerStarted","Data":"423aeacd820ec5c2d675794591067804e90d1f2a6923ef3a9f13012b659813bc"} Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.332078 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kxrm5" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.332138 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxrm5" event={"ID":"250e18d9-cb14-4309-8d0c-fb341511dba6","Type":"ContainerDied","Data":"17906d788647666b2cd6e069e9338a17bf04443c927ea4ca63e22baf04dbc8dc"} Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.333387 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17906d788647666b2cd6e069e9338a17bf04443c927ea4ca63e22baf04dbc8dc" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.342184 4845 generic.go:334] "Generic (PLEG): container finished" podID="08802fb3-9897-4819-a38b-fe13e8892b47" containerID="3ed1c7658c68f0c8c4a3c24a82346d038eae78839b07b31f8eefe48619b71f5d" exitCode=0 Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.342412 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" event={"ID":"08802fb3-9897-4819-a38b-fe13e8892b47","Type":"ContainerDied","Data":"3ed1c7658c68f0c8c4a3c24a82346d038eae78839b07b31f8eefe48619b71f5d"} Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.343812 4845 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.343830 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.453816 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c4f9db54b-5v9r8"] Feb 02 10:53:08 crc kubenswrapper[4845]: W0202 10:53:08.455114 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61e42051_311d_4b4b_af17_e301351d9267.slice/crio-6455add72597d00c7649f2b0ffe111a027f137f15effd389324caff9578aa249 WatchSource:0}: Error finding container 6455add72597d00c7649f2b0ffe111a027f137f15effd389324caff9578aa249: Status 404 returned error can't find the container with id 6455add72597d00c7649f2b0ffe111a027f137f15effd389324caff9578aa249 Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.842632 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68f64c64d8-r7nkx"] Feb 02 10:53:08 crc kubenswrapper[4845]: W0202 10:53:08.890275 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5978920a_e63d_4cb3_accd_4353fb398d50.slice/crio-e2d0394a7b810871bbdaa27e77b8a0f2bd4bd884a78bdd6ac816ce5bfbfab309 WatchSource:0}: Error finding container e2d0394a7b810871bbdaa27e77b8a0f2bd4bd884a78bdd6ac816ce5bfbfab309: Status 404 returned error can't find the container with id e2d0394a7b810871bbdaa27e77b8a0f2bd4bd884a78bdd6ac816ce5bfbfab309 Feb 02 10:53:08 crc kubenswrapper[4845]: I0202 10:53:08.990289 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fd6897c68-cspbg"] Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.158730 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.162456 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-hft5g" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.288215 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjfj8\" (UniqueName: \"kubernetes.io/projected/9868fb5b-b18e-42b0-8532-6e6a55da71d2-kube-api-access-cjfj8\") pod \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.288537 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-svc\") pod \"08802fb3-9897-4819-a38b-fe13e8892b47\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.288602 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-sb\") pod \"08802fb3-9897-4819-a38b-fe13e8892b47\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.288906 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-combined-ca-bundle\") pod \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\" (UID: \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.288968 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-db-sync-config-data\") pod \"9868fb5b-b18e-42b0-8532-6e6a55da71d2\" (UID: 
\"9868fb5b-b18e-42b0-8532-6e6a55da71d2\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.289029 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-swift-storage-0\") pod \"08802fb3-9897-4819-a38b-fe13e8892b47\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.289056 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-nb\") pod \"08802fb3-9897-4819-a38b-fe13e8892b47\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.289171 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-config\") pod \"08802fb3-9897-4819-a38b-fe13e8892b47\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.289210 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcx2l\" (UniqueName: \"kubernetes.io/projected/08802fb3-9897-4819-a38b-fe13e8892b47-kube-api-access-gcx2l\") pod \"08802fb3-9897-4819-a38b-fe13e8892b47\" (UID: \"08802fb3-9897-4819-a38b-fe13e8892b47\") " Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.312041 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9868fb5b-b18e-42b0-8532-6e6a55da71d2-kube-api-access-cjfj8" (OuterVolumeSpecName: "kube-api-access-cjfj8") pod "9868fb5b-b18e-42b0-8532-6e6a55da71d2" (UID: "9868fb5b-b18e-42b0-8532-6e6a55da71d2"). InnerVolumeSpecName "kube-api-access-cjfj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.332044 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9868fb5b-b18e-42b0-8532-6e6a55da71d2" (UID: "9868fb5b-b18e-42b0-8532-6e6a55da71d2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.332345 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08802fb3-9897-4819-a38b-fe13e8892b47-kube-api-access-gcx2l" (OuterVolumeSpecName: "kube-api-access-gcx2l") pod "08802fb3-9897-4819-a38b-fe13e8892b47" (UID: "08802fb3-9897-4819-a38b-fe13e8892b47"). InnerVolumeSpecName "kube-api-access-gcx2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.387858 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68f64c64d8-r7nkx" event={"ID":"5978920a-e63d-4cb3-accd-4353fb398d50","Type":"ContainerStarted","Data":"a47a0f95f0ce06450c22e2ed7acfa09c8176f7a345b685a71ab421ddd8c6e97c"} Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.387939 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68f64c64d8-r7nkx" event={"ID":"5978920a-e63d-4cb3-accd-4353fb398d50","Type":"ContainerStarted","Data":"e2d0394a7b810871bbdaa27e77b8a0f2bd4bd884a78bdd6ac816ce5bfbfab309"} Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.390796 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c4f9db54b-5v9r8" event={"ID":"61e42051-311d-4b4b-af17-e301351d9267","Type":"ContainerStarted","Data":"a171327ad4eaa29de568604671f066c8d6fe4c6e25dc2476d739955a7153396d"} Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.390840 4845 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-c4f9db54b-5v9r8" event={"ID":"61e42051-311d-4b4b-af17-e301351d9267","Type":"ContainerStarted","Data":"6455add72597d00c7649f2b0ffe111a027f137f15effd389324caff9578aa249"} Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.391870 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-c4f9db54b-5v9r8" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.392975 4845 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.393001 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcx2l\" (UniqueName: \"kubernetes.io/projected/08802fb3-9897-4819-a38b-fe13e8892b47-kube-api-access-gcx2l\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.393012 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjfj8\" (UniqueName: \"kubernetes.io/projected/9868fb5b-b18e-42b0-8532-6e6a55da71d2-kube-api-access-cjfj8\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.393124 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9868fb5b-b18e-42b0-8532-6e6a55da71d2" (UID: "9868fb5b-b18e-42b0-8532-6e6a55da71d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.396548 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" event={"ID":"08802fb3-9897-4819-a38b-fe13e8892b47","Type":"ContainerDied","Data":"990f2a82fd916ae58e607d4d041c7e0245ec85e42bc4953b989f05218c6f9e19"} Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.397438 4845 scope.go:117] "RemoveContainer" containerID="3ed1c7658c68f0c8c4a3c24a82346d038eae78839b07b31f8eefe48619b71f5d" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.397173 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-sbm2k" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.402973 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-hft5g" event={"ID":"9868fb5b-b18e-42b0-8532-6e6a55da71d2","Type":"ContainerDied","Data":"303b261fa422f333b504b71646d5d15f21982e8b9a4a8cba35c7d99137363acc"} Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.403140 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="303b261fa422f333b504b71646d5d15f21982e8b9a4a8cba35c7d99137363acc" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.403332 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-hft5g" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.411171 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fd6897c68-cspbg" event={"ID":"3231a338-4ba7-4851-9fd5-a7ba84f13089","Type":"ContainerStarted","Data":"72291ce4c24275105ea624fdae6cb6154c6bb75e37f637d2f201663a1789f5ec"} Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.411245 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fd6897c68-cspbg" event={"ID":"3231a338-4ba7-4851-9fd5-a7ba84f13089","Type":"ContainerStarted","Data":"228de4e8bc7c765fb5d366d131a0a0268b9b4fac526b28962bf893b2beef69a2"} Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.420229 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c4f9db54b-5v9r8" podStartSLOduration=2.420205893 podStartE2EDuration="2.420205893s" podCreationTimestamp="2026-02-02 10:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:09.410694101 +0000 UTC m=+1270.502095561" watchObservedRunningTime="2026-02-02 10:53:09.420205893 +0000 UTC m=+1270.511607343" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.475736 4845 scope.go:117] "RemoveContainer" containerID="ad0d9ec1ecba0e033c137223df62f5b330d1ec35e21a3f0e070e5061487e39e2" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.500695 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9868fb5b-b18e-42b0-8532-6e6a55da71d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.557009 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08802fb3-9897-4819-a38b-fe13e8892b47" 
(UID: "08802fb3-9897-4819-a38b-fe13e8892b47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.592853 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08802fb3-9897-4819-a38b-fe13e8892b47" (UID: "08802fb3-9897-4819-a38b-fe13e8892b47"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.608458 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.608507 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.613140 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08802fb3-9897-4819-a38b-fe13e8892b47" (UID: "08802fb3-9897-4819-a38b-fe13e8892b47"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.619290 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-config" (OuterVolumeSpecName: "config") pod "08802fb3-9897-4819-a38b-fe13e8892b47" (UID: "08802fb3-9897-4819-a38b-fe13e8892b47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.662548 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08802fb3-9897-4819-a38b-fe13e8892b47" (UID: "08802fb3-9897-4819-a38b-fe13e8892b47"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.746854 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.746903 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.746920 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08802fb3-9897-4819-a38b-fe13e8892b47-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.898284 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-555888887b-mbz72"] Feb 02 10:53:09 crc kubenswrapper[4845]: E0202 10:53:09.898855 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08802fb3-9897-4819-a38b-fe13e8892b47" containerName="init" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.898872 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="08802fb3-9897-4819-a38b-fe13e8892b47" containerName="init" Feb 02 10:53:09 crc kubenswrapper[4845]: E0202 10:53:09.898920 4845 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="250e18d9-cb14-4309-8d0c-fb341511dba6" containerName="heat-db-sync" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.898931 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="250e18d9-cb14-4309-8d0c-fb341511dba6" containerName="heat-db-sync" Feb 02 10:53:09 crc kubenswrapper[4845]: E0202 10:53:09.898950 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08802fb3-9897-4819-a38b-fe13e8892b47" containerName="dnsmasq-dns" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.898958 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="08802fb3-9897-4819-a38b-fe13e8892b47" containerName="dnsmasq-dns" Feb 02 10:53:09 crc kubenswrapper[4845]: E0202 10:53:09.898990 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9868fb5b-b18e-42b0-8532-6e6a55da71d2" containerName="barbican-db-sync" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.898998 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="9868fb5b-b18e-42b0-8532-6e6a55da71d2" containerName="barbican-db-sync" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.899297 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="08802fb3-9897-4819-a38b-fe13e8892b47" containerName="dnsmasq-dns" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.899317 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="9868fb5b-b18e-42b0-8532-6e6a55da71d2" containerName="barbican-db-sync" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.899334 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="250e18d9-cb14-4309-8d0c-fb341511dba6" containerName="heat-db-sync" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.900823 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.906022 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-954bfc4f9-dfghw"] Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.915690 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.943955 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-555888887b-mbz72"] Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.954495 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-954bfc4f9-dfghw"] Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.957552 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.957797 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.957973 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rddwg" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.958128 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.985003 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-4vk87"] Feb 02 10:53:09 crc kubenswrapper[4845]: I0202 10:53:09.987027 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.038035 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-4vk87"] Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075292 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-combined-ca-bundle\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075346 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-config-data\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075373 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29622\" (UniqueName: \"kubernetes.io/projected/a9ec709e-f840-4ba0-b631-77038f9c5551-kube-api-access-29622\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075401 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-config-data\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 
10:53:10.075532 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dea749b-261a-4af3-979a-127dca4af07c-logs\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075559 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-config-data-custom\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075578 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-combined-ca-bundle\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075619 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-config-data-custom\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075649 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf56m\" (UniqueName: \"kubernetes.io/projected/6dea749b-261a-4af3-979a-127dca4af07c-kube-api-access-rf56m\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: 
\"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.075693 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9ec709e-f840-4ba0-b631-77038f9c5551-logs\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178492 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178564 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dea749b-261a-4af3-979a-127dca4af07c-logs\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178651 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-config-data-custom\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178713 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c85th\" (UniqueName: 
\"kubernetes.io/projected/1e98b98e-a993-4000-90f3-3372541369fb-kube-api-access-c85th\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178747 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-combined-ca-bundle\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178797 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-config-data-custom\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178828 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf56m\" (UniqueName: \"kubernetes.io/projected/6dea749b-261a-4af3-979a-127dca4af07c-kube-api-access-rf56m\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178859 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9ec709e-f840-4ba0-b631-77038f9c5551-logs\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178883 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-config\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178944 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-combined-ca-bundle\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178973 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.178999 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-config-data\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.179029 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29622\" (UniqueName: \"kubernetes.io/projected/a9ec709e-f840-4ba0-b631-77038f9c5551-kube-api-access-29622\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.179069 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-config-data\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.179101 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.179143 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-svc\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.181193 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dea749b-261a-4af3-979a-127dca4af07c-logs\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.190848 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-config-data-custom\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.197842 4845 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9ec709e-f840-4ba0-b631-77038f9c5551-logs\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.200420 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-config-data\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.203057 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-combined-ca-bundle\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.204667 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-config-data-custom\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.210346 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dea749b-261a-4af3-979a-127dca4af07c-config-data\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.216309 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a9ec709e-f840-4ba0-b631-77038f9c5551-combined-ca-bundle\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.219140 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29622\" (UniqueName: \"kubernetes.io/projected/a9ec709e-f840-4ba0-b631-77038f9c5551-kube-api-access-29622\") pod \"barbican-worker-954bfc4f9-dfghw\" (UID: \"a9ec709e-f840-4ba0-b631-77038f9c5551\") " pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.219252 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf56m\" (UniqueName: \"kubernetes.io/projected/6dea749b-261a-4af3-979a-127dca4af07c-kube-api-access-rf56m\") pod \"barbican-keystone-listener-555888887b-mbz72\" (UID: \"6dea749b-261a-4af3-979a-127dca4af07c\") " pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.229035 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-c848b759d-9s78l"] Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.231028 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.233481 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.247060 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-sbm2k"] Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.269091 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-sbm2k"] Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.282474 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.282656 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.282741 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-svc\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.282928 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-swift-storage-0\") pod 
\"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.283080 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c85th\" (UniqueName: \"kubernetes.io/projected/1e98b98e-a993-4000-90f3-3372541369fb-kube-api-access-c85th\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.283269 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-config\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.284749 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-svc\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.286560 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-config\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.288748 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " 
pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.289954 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.290320 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.291509 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c848b759d-9s78l"] Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.326470 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c85th\" (UniqueName: \"kubernetes.io/projected/1e98b98e-a993-4000-90f3-3372541369fb-kube-api-access-c85th\") pod \"dnsmasq-dns-85ff748b95-4vk87\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") " pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.385433 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-logs\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.385490 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data-custom\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.385539 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn24r\" (UniqueName: \"kubernetes.io/projected/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-kube-api-access-fn24r\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.385603 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.386110 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-combined-ca-bundle\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.406293 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-555888887b-mbz72" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.448416 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68f64c64d8-r7nkx" event={"ID":"5978920a-e63d-4cb3-accd-4353fb398d50","Type":"ContainerStarted","Data":"17b7917cedcfcfc5e227ee4c25eb904c009d07260f12e62f3d70f72ffa70598c"} Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.449842 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68f64c64d8-r7nkx" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.449943 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68f64c64d8-r7nkx" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.456987 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.457036 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.458570 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fd6897c68-cspbg" event={"ID":"3231a338-4ba7-4851-9fd5-a7ba84f13089","Type":"ContainerStarted","Data":"a21d26fcb519a4c746b991dcfeca12c3245ddc62427af655c6f5de2c40b04948"} Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.459203 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7fd6897c68-cspbg" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.459311 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7fd6897c68-cspbg" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.472857 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-954bfc4f9-dfghw" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.485688 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-68f64c64d8-r7nkx" podStartSLOduration=3.485663823 podStartE2EDuration="3.485663823s" podCreationTimestamp="2026-02-02 10:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:10.470114599 +0000 UTC m=+1271.561516049" watchObservedRunningTime="2026-02-02 10:53:10.485663823 +0000 UTC m=+1271.577065273" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.489392 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-logs\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.489444 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data-custom\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.489468 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn24r\" (UniqueName: \"kubernetes.io/projected/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-kube-api-access-fn24r\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.489488 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.490391 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-logs\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.491781 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-combined-ca-bundle\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.494762 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data-custom\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.496847 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.512488 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-combined-ca-bundle\") pod 
\"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.517735 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7fd6897c68-cspbg" podStartSLOduration=3.517710368 podStartE2EDuration="3.517710368s" podCreationTimestamp="2026-02-02 10:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:10.512994184 +0000 UTC m=+1271.604395634" watchObservedRunningTime="2026-02-02 10:53:10.517710368 +0000 UTC m=+1271.609111818" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.522959 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn24r\" (UniqueName: \"kubernetes.io/projected/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-kube-api-access-fn24r\") pod \"barbican-api-c848b759d-9s78l\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.566254 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.566612 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.566647 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.566659 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.627795 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.647009 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.693437 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 10:53:10 crc kubenswrapper[4845]: I0202 10:53:10.698062 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.221861 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-555888887b-mbz72"] Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.307105 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-954bfc4f9-dfghw"] Feb 02 10:53:11 crc kubenswrapper[4845]: W0202 10:53:11.321448 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9ec709e_f840_4ba0_b631_77038f9c5551.slice/crio-3016059426dd7a405f81151db970d25366ab2fc6bef18710c9c470df08f3648b WatchSource:0}: Error finding container 3016059426dd7a405f81151db970d25366ab2fc6bef18710c9c470df08f3648b: Status 404 returned error can't find the container with id 3016059426dd7a405f81151db970d25366ab2fc6bef18710c9c470df08f3648b Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.404036 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-4vk87"] Feb 02 10:53:11 crc kubenswrapper[4845]: W0202 10:53:11.411171 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e98b98e_a993_4000_90f3_3372541369fb.slice/crio-46aea51eef4e5bca23196251795de36a78dad7c530d09f830de8c2b61b899f53 WatchSource:0}: 
Error finding container 46aea51eef4e5bca23196251795de36a78dad7c530d09f830de8c2b61b899f53: Status 404 returned error can't find the container with id 46aea51eef4e5bca23196251795de36a78dad7c530d09f830de8c2b61b899f53 Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.414463 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-c848b759d-9s78l"] Feb 02 10:53:11 crc kubenswrapper[4845]: W0202 10:53:11.427649 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ddd7b77_b40e_4cbd_bce4_eecb7b7eae98.slice/crio-9e1e2aebf9151dc6f3a68108d64e5db12d097d08be7641fdb5a06d1241111d90 WatchSource:0}: Error finding container 9e1e2aebf9151dc6f3a68108d64e5db12d097d08be7641fdb5a06d1241111d90: Status 404 returned error can't find the container with id 9e1e2aebf9151dc6f3a68108d64e5db12d097d08be7641fdb5a06d1241111d90 Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.483840 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-954bfc4f9-dfghw" event={"ID":"a9ec709e-f840-4ba0-b631-77038f9c5551","Type":"ContainerStarted","Data":"3016059426dd7a405f81151db970d25366ab2fc6bef18710c9c470df08f3648b"} Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.493928 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-555888887b-mbz72" event={"ID":"6dea749b-261a-4af3-979a-127dca4af07c","Type":"ContainerStarted","Data":"676383dd39e311295ba943d078e853f42f0ecda45d380800c574ef6a2d9dae3e"} Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.498730 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" event={"ID":"1e98b98e-a993-4000-90f3-3372541369fb","Type":"ContainerStarted","Data":"46aea51eef4e5bca23196251795de36a78dad7c530d09f830de8c2b61b899f53"} Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.503269 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-c848b759d-9s78l" event={"ID":"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98","Type":"ContainerStarted","Data":"9e1e2aebf9151dc6f3a68108d64e5db12d097d08be7641fdb5a06d1241111d90"} Feb 02 10:53:11 crc kubenswrapper[4845]: I0202 10:53:11.743394 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08802fb3-9897-4819-a38b-fe13e8892b47" path="/var/lib/kubelet/pods/08802fb3-9897-4819-a38b-fe13e8892b47/volumes" Feb 02 10:53:12 crc kubenswrapper[4845]: I0202 10:53:12.529461 4845 generic.go:334] "Generic (PLEG): container finished" podID="1e98b98e-a993-4000-90f3-3372541369fb" containerID="a506c6cd7f566d8573c6029c7963e6c4b4d345abfac828b450e7916b80814864" exitCode=0 Feb 02 10:53:12 crc kubenswrapper[4845]: I0202 10:53:12.530082 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" event={"ID":"1e98b98e-a993-4000-90f3-3372541369fb","Type":"ContainerDied","Data":"a506c6cd7f566d8573c6029c7963e6c4b4d345abfac828b450e7916b80814864"} Feb 02 10:53:12 crc kubenswrapper[4845]: I0202 10:53:12.566821 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c848b759d-9s78l" event={"ID":"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98","Type":"ContainerStarted","Data":"176473ee3218eaa7c0eebfe1c20972c9902b004b5527eec40d3ad45e506fbd6b"} Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.361400 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8448c87f86-gdg49"] Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.389035 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.393016 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.393588 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.430298 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8448c87f86-gdg49"] Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.456249 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5smqk\" (UniqueName: \"kubernetes.io/projected/4d926fea-dae3-4818-a608-4d9fa52abef5-kube-api-access-5smqk\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.456372 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d926fea-dae3-4818-a608-4d9fa52abef5-logs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.456421 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-config-data-custom\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.456465 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-config-data\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.456526 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-public-tls-certs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.457098 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-combined-ca-bundle\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.457187 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-internal-tls-certs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.560591 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-combined-ca-bundle\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.560653 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-internal-tls-certs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.560700 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5smqk\" (UniqueName: \"kubernetes.io/projected/4d926fea-dae3-4818-a608-4d9fa52abef5-kube-api-access-5smqk\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.560748 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d926fea-dae3-4818-a608-4d9fa52abef5-logs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.560779 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-config-data-custom\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.560812 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-config-data\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.560848 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-public-tls-certs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.576624 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d926fea-dae3-4818-a608-4d9fa52abef5-logs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.581043 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" event={"ID":"1e98b98e-a993-4000-90f3-3372541369fb","Type":"ContainerStarted","Data":"d65ee280aa67a3b9d4ada8a3d8e251139871dc36b778ac679ad93aaaa994de0c"} Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.581204 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.584810 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c848b759d-9s78l" event={"ID":"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98","Type":"ContainerStarted","Data":"b43d9c1e2c65f2ee90a2f57f9bbc1d12c9cf5e6cdd4115a797d97d1794b7caa6"} Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.585099 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.585127 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.585556 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-combined-ca-bundle\") pod 
\"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.587089 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-config-data-custom\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.596346 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-public-tls-certs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.597115 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5smqk\" (UniqueName: \"kubernetes.io/projected/4d926fea-dae3-4818-a608-4d9fa52abef5-kube-api-access-5smqk\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.597211 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-config-data\") pod \"barbican-api-8448c87f86-gdg49\" (UID: \"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.597336 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d926fea-dae3-4818-a608-4d9fa52abef5-internal-tls-certs\") pod \"barbican-api-8448c87f86-gdg49\" (UID: 
\"4d926fea-dae3-4818-a608-4d9fa52abef5\") " pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.607363 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" podStartSLOduration=4.607337913 podStartE2EDuration="4.607337913s" podCreationTimestamp="2026-02-02 10:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:13.603843884 +0000 UTC m=+1274.695245334" watchObservedRunningTime="2026-02-02 10:53:13.607337913 +0000 UTC m=+1274.698739363" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.635329 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-c848b759d-9s78l" podStartSLOduration=3.635271041 podStartE2EDuration="3.635271041s" podCreationTimestamp="2026-02-02 10:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:13.628141887 +0000 UTC m=+1274.719543337" watchObservedRunningTime="2026-02-02 10:53:13.635271041 +0000 UTC m=+1274.726672501" Feb 02 10:53:13 crc kubenswrapper[4845]: I0202 10:53:13.746216 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:14 crc kubenswrapper[4845]: I0202 10:53:14.320333 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8448c87f86-gdg49"] Feb 02 10:53:15 crc kubenswrapper[4845]: I0202 10:53:15.620471 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8448c87f86-gdg49" event={"ID":"4d926fea-dae3-4818-a608-4d9fa52abef5","Type":"ContainerStarted","Data":"4219ffb364f9dc6fa9c34761a8a90010efdcbb6c1a8369e6dcfa7bce1ab1673d"} Feb 02 10:53:16 crc kubenswrapper[4845]: I0202 10:53:16.238098 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:53:16 crc kubenswrapper[4845]: I0202 10:53:16.238608 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:53:17 crc kubenswrapper[4845]: I0202 10:53:17.197320 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 10:53:17 crc kubenswrapper[4845]: I0202 10:53:17.198243 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 10:53:17 crc kubenswrapper[4845]: I0202 10:53:17.215275 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 10:53:17 crc kubenswrapper[4845]: I0202 10:53:17.215419 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 
10:53:17 crc kubenswrapper[4845]: I0202 10:53:17.273243 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 10:53:20 crc kubenswrapper[4845]: I0202 10:53:20.631772 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" Feb 02 10:53:20 crc kubenswrapper[4845]: I0202 10:53:20.720227 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mrvw8"] Feb 02 10:53:20 crc kubenswrapper[4845]: I0202 10:53:20.720475 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" podUID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerName="dnsmasq-dns" containerID="cri-o://b19033b665f9424ff6ceb80860a3f821b9318c6ff803e1c8ca9ff4c4824c2a24" gracePeriod=10 Feb 02 10:53:21 crc kubenswrapper[4845]: I0202 10:53:21.749181 4845 generic.go:334] "Generic (PLEG): container finished" podID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerID="b19033b665f9424ff6ceb80860a3f821b9318c6ff803e1c8ca9ff4c4824c2a24" exitCode=0 Feb 02 10:53:21 crc kubenswrapper[4845]: I0202 10:53:21.749509 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" event={"ID":"33b0a7fa-f66e-470e-95a3-a110ecec168b","Type":"ContainerDied","Data":"b19033b665f9424ff6ceb80860a3f821b9318c6ff803e1c8ca9ff4c4824c2a24"} Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.139643 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.187378 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-sb\") pod \"33b0a7fa-f66e-470e-95a3-a110ecec168b\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.187459 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tnwb\" (UniqueName: \"kubernetes.io/projected/33b0a7fa-f66e-470e-95a3-a110ecec168b-kube-api-access-2tnwb\") pod \"33b0a7fa-f66e-470e-95a3-a110ecec168b\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.187546 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-svc\") pod \"33b0a7fa-f66e-470e-95a3-a110ecec168b\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.187597 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-nb\") pod \"33b0a7fa-f66e-470e-95a3-a110ecec168b\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.187668 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-swift-storage-0\") pod \"33b0a7fa-f66e-470e-95a3-a110ecec168b\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.187790 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-config\") pod \"33b0a7fa-f66e-470e-95a3-a110ecec168b\" (UID: \"33b0a7fa-f66e-470e-95a3-a110ecec168b\") " Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.211429 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b0a7fa-f66e-470e-95a3-a110ecec168b-kube-api-access-2tnwb" (OuterVolumeSpecName: "kube-api-access-2tnwb") pod "33b0a7fa-f66e-470e-95a3-a110ecec168b" (UID: "33b0a7fa-f66e-470e-95a3-a110ecec168b"). InnerVolumeSpecName "kube-api-access-2tnwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.279774 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33b0a7fa-f66e-470e-95a3-a110ecec168b" (UID: "33b0a7fa-f66e-470e-95a3-a110ecec168b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.282212 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33b0a7fa-f66e-470e-95a3-a110ecec168b" (UID: "33b0a7fa-f66e-470e-95a3-a110ecec168b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.286935 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-config" (OuterVolumeSpecName: "config") pod "33b0a7fa-f66e-470e-95a3-a110ecec168b" (UID: "33b0a7fa-f66e-470e-95a3-a110ecec168b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.290998 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tnwb\" (UniqueName: \"kubernetes.io/projected/33b0a7fa-f66e-470e-95a3-a110ecec168b-kube-api-access-2tnwb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.291024 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.291034 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.291043 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.296483 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33b0a7fa-f66e-470e-95a3-a110ecec168b" (UID: "33b0a7fa-f66e-470e-95a3-a110ecec168b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.301361 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33b0a7fa-f66e-470e-95a3-a110ecec168b" (UID: "33b0a7fa-f66e-470e-95a3-a110ecec168b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:22 crc kubenswrapper[4845]: E0202 10:53:22.359993 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ba91fb37-4550-4684-99bb-45dba169a879" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.393723 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.393756 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33b0a7fa-f66e-470e-95a3-a110ecec168b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.419953 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.501988 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.773481 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.773779 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" event={"ID":"33b0a7fa-f66e-470e-95a3-a110ecec168b","Type":"ContainerDied","Data":"24bfd21204b757a912a5fde09fd51c09b7d4dece5bb10e7fce06a7581904b6df"} Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.775005 4845 scope.go:117] "RemoveContainer" containerID="b19033b665f9424ff6ceb80860a3f821b9318c6ff803e1c8ca9ff4c4824c2a24" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.798719 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="ceilometer-notification-agent" containerID="cri-o://4c795b185512cbaa08a41087a290ab376088c31c6277e1a8f0ee3f21dc22200f" gracePeriod=30 Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.799035 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba91fb37-4550-4684-99bb-45dba169a879","Type":"ContainerStarted","Data":"7f93106b71edc6fc8f88297c4c620682fa2e4fe0213e9e1cad77a132eee7f48a"} Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.799089 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.799474 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="proxy-httpd" containerID="cri-o://7f93106b71edc6fc8f88297c4c620682fa2e4fe0213e9e1cad77a132eee7f48a" gracePeriod=30 Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.799544 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="sg-core" 
containerID="cri-o://423aeacd820ec5c2d675794591067804e90d1f2a6923ef3a9f13012b659813bc" gracePeriod=30 Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.809541 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-555888887b-mbz72" event={"ID":"6dea749b-261a-4af3-979a-127dca4af07c","Type":"ContainerStarted","Data":"575ab7f2f7fe4e8106c2466a57472a57241cdaf710df402b30f388c226326ecb"} Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.809587 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-555888887b-mbz72" event={"ID":"6dea749b-261a-4af3-979a-127dca4af07c","Type":"ContainerStarted","Data":"e4fdb2540fa2b5886f2833bca4f1069d1b8cb2bad984df9fdfc72a28d64bea6a"} Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.832774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8448c87f86-gdg49" event={"ID":"4d926fea-dae3-4818-a608-4d9fa52abef5","Type":"ContainerStarted","Data":"275e844bce010a7a4a656fff83cd8ffa88fbe0966b827ef884820dbb3e644a13"} Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.833204 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.833814 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8448c87f86-gdg49" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.837282 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-954bfc4f9-dfghw" event={"ID":"a9ec709e-f840-4ba0-b631-77038f9c5551","Type":"ContainerStarted","Data":"73599b0ac8c799feeb156821205d5b156f8e7048f85c646545007104aad91fbc"} Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.837327 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-954bfc4f9-dfghw" 
event={"ID":"a9ec709e-f840-4ba0-b631-77038f9c5551","Type":"ContainerStarted","Data":"977364fb0517d6e9dae517c49539321a86f1f58be0fb4dd298f952ce75d3729c"} Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.864695 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-555888887b-mbz72" podStartSLOduration=3.383382898 podStartE2EDuration="13.864666601s" podCreationTimestamp="2026-02-02 10:53:09 +0000 UTC" firstStartedPulling="2026-02-02 10:53:11.18846511 +0000 UTC m=+1272.279866570" lastFinishedPulling="2026-02-02 10:53:21.669748823 +0000 UTC m=+1282.761150273" observedRunningTime="2026-02-02 10:53:22.845595436 +0000 UTC m=+1283.936996886" watchObservedRunningTime="2026-02-02 10:53:22.864666601 +0000 UTC m=+1283.956068051" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.922908 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8448c87f86-gdg49" podStartSLOduration=9.922875173 podStartE2EDuration="9.922875173s" podCreationTimestamp="2026-02-02 10:53:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:22.873930255 +0000 UTC m=+1283.965331705" watchObservedRunningTime="2026-02-02 10:53:22.922875173 +0000 UTC m=+1284.014276633" Feb 02 10:53:22 crc kubenswrapper[4845]: I0202 10:53:22.962326 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-954bfc4f9-dfghw" podStartSLOduration=3.618827651 podStartE2EDuration="13.962303789s" podCreationTimestamp="2026-02-02 10:53:09 +0000 UTC" firstStartedPulling="2026-02-02 10:53:11.326217473 +0000 UTC m=+1272.417618923" lastFinishedPulling="2026-02-02 10:53:21.669693611 +0000 UTC m=+1282.761095061" observedRunningTime="2026-02-02 10:53:22.903083798 +0000 UTC m=+1283.994485248" watchObservedRunningTime="2026-02-02 10:53:22.962303789 +0000 UTC m=+1284.053705239" Feb 
02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.021122 4845 scope.go:117] "RemoveContainer" containerID="720ce281ce34166ec16d662e6b2cfb3e73984fcbc45102640b4bba9183262c78" Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.024099 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mrvw8"] Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.034090 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-mrvw8"] Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.730844 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b0a7fa-f66e-470e-95a3-a110ecec168b" path="/var/lib/kubelet/pods/33b0a7fa-f66e-470e-95a3-a110ecec168b/volumes" Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.916943 4845 generic.go:334] "Generic (PLEG): container finished" podID="ba91fb37-4550-4684-99bb-45dba169a879" containerID="7f93106b71edc6fc8f88297c4c620682fa2e4fe0213e9e1cad77a132eee7f48a" exitCode=0 Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.916988 4845 generic.go:334] "Generic (PLEG): container finished" podID="ba91fb37-4550-4684-99bb-45dba169a879" containerID="423aeacd820ec5c2d675794591067804e90d1f2a6923ef3a9f13012b659813bc" exitCode=2 Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.917001 4845 generic.go:334] "Generic (PLEG): container finished" podID="ba91fb37-4550-4684-99bb-45dba169a879" containerID="4c795b185512cbaa08a41087a290ab376088c31c6277e1a8f0ee3f21dc22200f" exitCode=0 Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.917027 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba91fb37-4550-4684-99bb-45dba169a879","Type":"ContainerDied","Data":"7f93106b71edc6fc8f88297c4c620682fa2e4fe0213e9e1cad77a132eee7f48a"} Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.917146 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ba91fb37-4550-4684-99bb-45dba169a879","Type":"ContainerDied","Data":"423aeacd820ec5c2d675794591067804e90d1f2a6923ef3a9f13012b659813bc"} Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.917167 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba91fb37-4550-4684-99bb-45dba169a879","Type":"ContainerDied","Data":"4c795b185512cbaa08a41087a290ab376088c31c6277e1a8f0ee3f21dc22200f"} Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.917178 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba91fb37-4550-4684-99bb-45dba169a879","Type":"ContainerDied","Data":"320ae8c7183d27e383573443ad820bda2273505766b721abc86327e40604e0ca"} Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.917205 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="320ae8c7183d27e383573443ad820bda2273505766b721abc86327e40604e0ca" Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.925656 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g8b4r" event={"ID":"183b0ef9-490f-43a1-a464-2bd64a820ebd","Type":"ContainerStarted","Data":"b6c0a29825672ec889a1c4e9480e6e2959d05e730a7944a0c2ac39bff41e3be4"} Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.938319 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.948414 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8448c87f86-gdg49" event={"ID":"4d926fea-dae3-4818-a608-4d9fa52abef5","Type":"ContainerStarted","Data":"052db1f414ef06472a20f02fff0ddd12feda87931a4d2148b7c9490e7c8e3c47"}
Feb 02 10:53:23 crc kubenswrapper[4845]: I0202 10:53:23.949721 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-g8b4r" podStartSLOduration=5.241277522 podStartE2EDuration="58.949698981s" podCreationTimestamp="2026-02-02 10:52:25 +0000 UTC" firstStartedPulling="2026-02-02 10:52:27.962034974 +0000 UTC m=+1229.053436424" lastFinishedPulling="2026-02-02 10:53:21.670456433 +0000 UTC m=+1282.761857883" observedRunningTime="2026-02-02 10:53:23.949513026 +0000 UTC m=+1285.040914476" watchObservedRunningTime="2026-02-02 10:53:23.949698981 +0000 UTC m=+1285.041100431"
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.054550 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-config-data\") pod \"ba91fb37-4550-4684-99bb-45dba169a879\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") "
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.054923 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-combined-ca-bundle\") pod \"ba91fb37-4550-4684-99bb-45dba169a879\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") "
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.055112 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-sg-core-conf-yaml\") pod \"ba91fb37-4550-4684-99bb-45dba169a879\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") "
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.055148 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-scripts\") pod \"ba91fb37-4550-4684-99bb-45dba169a879\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") "
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.055299 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-log-httpd\") pod \"ba91fb37-4550-4684-99bb-45dba169a879\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") "
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.055358 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g7v5\" (UniqueName: \"kubernetes.io/projected/ba91fb37-4550-4684-99bb-45dba169a879-kube-api-access-4g7v5\") pod \"ba91fb37-4550-4684-99bb-45dba169a879\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") "
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.055381 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-run-httpd\") pod \"ba91fb37-4550-4684-99bb-45dba169a879\" (UID: \"ba91fb37-4550-4684-99bb-45dba169a879\") "
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.056050 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ba91fb37-4550-4684-99bb-45dba169a879" (UID: "ba91fb37-4550-4684-99bb-45dba169a879"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.056252 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ba91fb37-4550-4684-99bb-45dba169a879" (UID: "ba91fb37-4550-4684-99bb-45dba169a879"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.056913 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.056937 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba91fb37-4550-4684-99bb-45dba169a879-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.066180 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba91fb37-4550-4684-99bb-45dba169a879-kube-api-access-4g7v5" (OuterVolumeSpecName: "kube-api-access-4g7v5") pod "ba91fb37-4550-4684-99bb-45dba169a879" (UID: "ba91fb37-4550-4684-99bb-45dba169a879"). InnerVolumeSpecName "kube-api-access-4g7v5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.086848 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-scripts" (OuterVolumeSpecName: "scripts") pod "ba91fb37-4550-4684-99bb-45dba169a879" (UID: "ba91fb37-4550-4684-99bb-45dba169a879"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.092984 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ba91fb37-4550-4684-99bb-45dba169a879" (UID: "ba91fb37-4550-4684-99bb-45dba169a879"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.160373 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.160413 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.160422 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g7v5\" (UniqueName: \"kubernetes.io/projected/ba91fb37-4550-4684-99bb-45dba169a879-kube-api-access-4g7v5\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.163052 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba91fb37-4550-4684-99bb-45dba169a879" (UID: "ba91fb37-4550-4684-99bb-45dba169a879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.214471 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-config-data" (OuterVolumeSpecName: "config-data") pod "ba91fb37-4550-4684-99bb-45dba169a879" (UID: "ba91fb37-4550-4684-99bb-45dba169a879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.263026 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.263064 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba91fb37-4550-4684-99bb-45dba169a879-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:24 crc kubenswrapper[4845]: I0202 10:53:24.959910 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.035324 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.050576 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.063788 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:53:25 crc kubenswrapper[4845]: E0202 10:53:25.064426 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="sg-core"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064453 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="sg-core"
Feb 02 10:53:25 crc kubenswrapper[4845]: E0202 10:53:25.064490 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="proxy-httpd"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064501 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="proxy-httpd"
Feb 02 10:53:25 crc kubenswrapper[4845]: E0202 10:53:25.064524 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerName="dnsmasq-dns"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064534 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerName="dnsmasq-dns"
Feb 02 10:53:25 crc kubenswrapper[4845]: E0202 10:53:25.064563 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="ceilometer-notification-agent"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064573 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="ceilometer-notification-agent"
Feb 02 10:53:25 crc kubenswrapper[4845]: E0202 10:53:25.064589 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerName="init"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064598 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerName="init"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064914 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerName="dnsmasq-dns"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064939 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="ceilometer-notification-agent"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064956 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="sg-core"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.064973 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba91fb37-4550-4684-99bb-45dba169a879" containerName="proxy-httpd"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.067187 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.070421 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.070769 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.085646 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.185214 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-run-httpd\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.185279 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z6ch\" (UniqueName: \"kubernetes.io/projected/e4430b5f-6421-41e2-b338-3b215c57957a-kube-api-access-7z6ch\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.185316 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.185341 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-config-data\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.185378 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-log-httpd\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.185435 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-scripts\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.185460 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.287069 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-run-httpd\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.287139 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z6ch\" (UniqueName: \"kubernetes.io/projected/e4430b5f-6421-41e2-b338-3b215c57957a-kube-api-access-7z6ch\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.287190 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.287230 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-config-data\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.287760 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-run-httpd\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.288100 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-log-httpd\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.288348 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-scripts\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.288427 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-log-httpd\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.288728 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.292180 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-scripts\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.292514 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.293483 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.294720 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-config-data\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.315204 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z6ch\" (UniqueName: \"kubernetes.io/projected/e4430b5f-6421-41e2-b338-3b215c57957a-kube-api-access-7z6ch\") pod \"ceilometer-0\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.399735 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:53:25 crc kubenswrapper[4845]: I0202 10:53:25.729320 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba91fb37-4550-4684-99bb-45dba169a879" path="/var/lib/kubelet/pods/ba91fb37-4550-4684-99bb-45dba169a879/volumes"
Feb 02 10:53:26 crc kubenswrapper[4845]: I0202 10:53:26.027647 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:53:26 crc kubenswrapper[4845]: I0202 10:53:26.992959 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerStarted","Data":"fbe9c7e888f7c781c1053770c1b8f9cfc807348795f8c409217a85d5272ec120"}
Feb 02 10:53:26 crc kubenswrapper[4845]: I0202 10:53:26.993268 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerStarted","Data":"253fc8db05e07f1a530e09a4e9ff070908466c9701a4e2cca1a5c237104581b4"}
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.081148 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-mrvw8" podUID="33b0a7fa-f66e-470e-95a3-a110ecec168b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: i/o timeout"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.271699 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c658d9d4-mvn9b"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.569337 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6898599c95-65qmn"]
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.574423 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6898599c95-65qmn" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-api" containerID="cri-o://c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618" gracePeriod=30
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.574999 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6898599c95-65qmn" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-httpd" containerID="cri-o://9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d" gracePeriod=30
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.618665 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b8db7b6ff-lx6zl"]
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.620877 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.654010 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b8db7b6ff-lx6zl"]
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.686737 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6898599c95-65qmn" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.197:9696/\": read tcp 10.217.0.2:37340->10.217.0.197:9696: read: connection reset by peer"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.763701 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-combined-ca-bundle\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.763804 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr7mj\" (UniqueName: \"kubernetes.io/projected/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-kube-api-access-vr7mj\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.763839 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-httpd-config\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.764074 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-ovndb-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.764171 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-config\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.764261 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-public-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.764344 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-internal-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.866215 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-ovndb-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.866276 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-config\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.866319 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-public-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.866373 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-internal-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.866414 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-combined-ca-bundle\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.866469 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr7mj\" (UniqueName: \"kubernetes.io/projected/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-kube-api-access-vr7mj\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.866486 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-httpd-config\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.872103 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-internal-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.872134 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-public-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.872853 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-ovndb-tls-certs\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.873212 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-config\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.874418 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-httpd-config\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.877301 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-combined-ca-bundle\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.886508 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr7mj\" (UniqueName: \"kubernetes.io/projected/7b4befc3-7f3f-4813-9c5e-9fac28d60f72-kube-api-access-vr7mj\") pod \"neutron-5b8db7b6ff-lx6zl\" (UID: \"7b4befc3-7f3f-4813-9c5e-9fac28d60f72\") " pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:27 crc kubenswrapper[4845]: I0202 10:53:27.948903 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:28 crc kubenswrapper[4845]: I0202 10:53:28.010740 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerStarted","Data":"cf0af5c5e3f564b6556befcc6a6f25d0bf34be4299c282d46a9c4ef4ba6b6020"}
Feb 02 10:53:28 crc kubenswrapper[4845]: I0202 10:53:28.013711 4845 generic.go:334] "Generic (PLEG): container finished" podID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerID="9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d" exitCode=0
Feb 02 10:53:28 crc kubenswrapper[4845]: I0202 10:53:28.013749 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6898599c95-65qmn" event={"ID":"0d345372-d7c4-4094-b9cb-e2afbd2dbf54","Type":"ContainerDied","Data":"9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d"}
Feb 02 10:53:28 crc kubenswrapper[4845]: I0202 10:53:28.587472 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b8db7b6ff-lx6zl"]
Feb 02 10:53:29 crc kubenswrapper[4845]: I0202 10:53:29.030534 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerStarted","Data":"667445270cb8d9e7bc55a98bf683b24cb0b8dad3cdc9fba27ac254a584303435"}
Feb 02 10:53:29 crc kubenswrapper[4845]: I0202 10:53:29.036786 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b8db7b6ff-lx6zl" event={"ID":"7b4befc3-7f3f-4813-9c5e-9fac28d60f72","Type":"ContainerStarted","Data":"010d122d3c23baef91e81ddd89b34876f952d908b5e75ad031480c0b8b37cfcf"}
Feb 02 10:53:29 crc kubenswrapper[4845]: I0202 10:53:29.036868 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b8db7b6ff-lx6zl" event={"ID":"7b4befc3-7f3f-4813-9c5e-9fac28d60f72","Type":"ContainerStarted","Data":"5b48343b7812c2a915ac1190e7ed3c58017b026297b081721f298a101490067a"}
Feb 02 10:53:29 crc kubenswrapper[4845]: I0202 10:53:29.434045 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-6898599c95-65qmn" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.197:9696/\": dial tcp 10.217.0.197:9696: connect: connection refused"
Feb 02 10:53:30 crc kubenswrapper[4845]: I0202 10:53:30.070372 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b8db7b6ff-lx6zl" event={"ID":"7b4befc3-7f3f-4813-9c5e-9fac28d60f72","Type":"ContainerStarted","Data":"e716c03f22b8b84ac51f1230761a01d739c457d9b43950e32626c7ae7a66172f"}
Feb 02 10:53:30 crc kubenswrapper[4845]: I0202 10:53:30.072025 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b8db7b6ff-lx6zl"
Feb 02 10:53:30 crc kubenswrapper[4845]: I0202 10:53:30.824849 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8448c87f86-gdg49"
Feb 02 10:53:30 crc kubenswrapper[4845]: I0202 10:53:30.850845 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b8db7b6ff-lx6zl" podStartSLOduration=3.850801382 podStartE2EDuration="3.850801382s" podCreationTimestamp="2026-02-02 10:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:30.103297199 +0000 UTC m=+1291.194698649" watchObservedRunningTime="2026-02-02 10:53:30.850801382 +0000 UTC m=+1291.942202852"
Feb 02 10:53:30 crc kubenswrapper[4845]: I0202 10:53:30.879614 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8448c87f86-gdg49"
Feb 02 10:53:30 crc kubenswrapper[4845]: I0202 10:53:30.939297 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-c848b759d-9s78l"]
Feb 02 10:53:30 crc kubenswrapper[4845]: I0202 10:53:30.939563 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-c848b759d-9s78l" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerName="barbican-api-log" containerID="cri-o://176473ee3218eaa7c0eebfe1c20972c9902b004b5527eec40d3ad45e506fbd6b" gracePeriod=30
Feb 02 10:53:30 crc kubenswrapper[4845]: I0202 10:53:30.939713 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-c848b759d-9s78l" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerName="barbican-api" containerID="cri-o://b43d9c1e2c65f2ee90a2f57f9bbc1d12c9cf5e6cdd4115a797d97d1794b7caa6" gracePeriod=30
Feb 02 10:53:31 crc kubenswrapper[4845]: I0202 10:53:31.094161 4845 generic.go:334] "Generic (PLEG): container finished" podID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerID="176473ee3218eaa7c0eebfe1c20972c9902b004b5527eec40d3ad45e506fbd6b" exitCode=143
Feb 02 10:53:31 crc kubenswrapper[4845]: I0202 10:53:31.094521 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c848b759d-9s78l" event={"ID":"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98","Type":"ContainerDied","Data":"176473ee3218eaa7c0eebfe1c20972c9902b004b5527eec40d3ad45e506fbd6b"}
Feb 02 10:53:31 crc kubenswrapper[4845]: I0202 10:53:31.109837 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerStarted","Data":"8728f9654023c3d16f466f99002e731e63223d9431b4fa5e2833683a43d715a8"}
Feb 02 10:53:31 crc kubenswrapper[4845]: I0202 10:53:31.110492 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 02 10:53:31 crc kubenswrapper[4845]: I0202 10:53:31.134508 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.760753242 podStartE2EDuration="6.134487722s" podCreationTimestamp="2026-02-02 10:53:25 +0000 UTC" firstStartedPulling="2026-02-02 10:53:26.038426008 +0000 UTC m=+1287.129827458" lastFinishedPulling="2026-02-02 10:53:30.412160498 +0000 UTC m=+1291.503561938" observedRunningTime="2026-02-02 10:53:31.133020851 +0000 UTC m=+1292.224422311" watchObservedRunningTime="2026-02-02 10:53:31.134487722 +0000 UTC m=+1292.225889172"
Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.062378 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6898599c95-65qmn"
Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.122042 4845 generic.go:334] "Generic (PLEG): container finished" podID="183b0ef9-490f-43a1-a464-2bd64a820ebd" containerID="b6c0a29825672ec889a1c4e9480e6e2959d05e730a7944a0c2ac39bff41e3be4" exitCode=0
Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.122114 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g8b4r" event={"ID":"183b0ef9-490f-43a1-a464-2bd64a820ebd","Type":"ContainerDied","Data":"b6c0a29825672ec889a1c4e9480e6e2959d05e730a7944a0c2ac39bff41e3be4"}
Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.128725 4845 generic.go:334] "Generic (PLEG): container finished" podID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerID="c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618" exitCode=0
Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.128774 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-6898599c95-65qmn" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.128855 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6898599c95-65qmn" event={"ID":"0d345372-d7c4-4094-b9cb-e2afbd2dbf54","Type":"ContainerDied","Data":"c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618"} Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.129027 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6898599c95-65qmn" event={"ID":"0d345372-d7c4-4094-b9cb-e2afbd2dbf54","Type":"ContainerDied","Data":"00ea85117a3a7a613efb0ee0b197731f8faf53d379b635789e63fb18dc175257"} Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.129048 4845 scope.go:117] "RemoveContainer" containerID="9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.154900 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-combined-ca-bundle\") pod \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.155178 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-ovndb-tls-certs\") pod \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.155319 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-httpd-config\") pod \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.155428 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-public-tls-certs\") pod \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.156054 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk9ml\" (UniqueName: \"kubernetes.io/projected/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-kube-api-access-mk9ml\") pod \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.156186 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-internal-tls-certs\") pod \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.156326 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-config\") pod \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\" (UID: \"0d345372-d7c4-4094-b9cb-e2afbd2dbf54\") " Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.165704 4845 scope.go:117] "RemoveContainer" containerID="c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.188102 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0d345372-d7c4-4094-b9cb-e2afbd2dbf54" (UID: "0d345372-d7c4-4094-b9cb-e2afbd2dbf54"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.190135 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-kube-api-access-mk9ml" (OuterVolumeSpecName: "kube-api-access-mk9ml") pod "0d345372-d7c4-4094-b9cb-e2afbd2dbf54" (UID: "0d345372-d7c4-4094-b9cb-e2afbd2dbf54"). InnerVolumeSpecName "kube-api-access-mk9ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.261578 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk9ml\" (UniqueName: \"kubernetes.io/projected/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-kube-api-access-mk9ml\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.261823 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.277264 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0d345372-d7c4-4094-b9cb-e2afbd2dbf54" (UID: "0d345372-d7c4-4094-b9cb-e2afbd2dbf54"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.284158 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-config" (OuterVolumeSpecName: "config") pod "0d345372-d7c4-4094-b9cb-e2afbd2dbf54" (UID: "0d345372-d7c4-4094-b9cb-e2afbd2dbf54"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.286012 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0d345372-d7c4-4094-b9cb-e2afbd2dbf54" (UID: "0d345372-d7c4-4094-b9cb-e2afbd2dbf54"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.302008 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d345372-d7c4-4094-b9cb-e2afbd2dbf54" (UID: "0d345372-d7c4-4094-b9cb-e2afbd2dbf54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.318092 4845 scope.go:117] "RemoveContainer" containerID="9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d" Feb 02 10:53:32 crc kubenswrapper[4845]: E0202 10:53:32.318836 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d\": container with ID starting with 9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d not found: ID does not exist" containerID="9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.318936 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d"} err="failed to get container status \"9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d\": rpc error: code = NotFound desc = could not find container 
\"9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d\": container with ID starting with 9c9dd7addf0d5bf3f26f958d7335764b784970862b4639f903863e4dcd0c828d not found: ID does not exist" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.318977 4845 scope.go:117] "RemoveContainer" containerID="c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618" Feb 02 10:53:32 crc kubenswrapper[4845]: E0202 10:53:32.322140 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618\": container with ID starting with c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618 not found: ID does not exist" containerID="c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.322218 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618"} err="failed to get container status \"c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618\": rpc error: code = NotFound desc = could not find container \"c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618\": container with ID starting with c741a0827d7fd6ab35372c65a855c22f8cc1528974b1ae3a9963caed9499f618 not found: ID does not exist" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.326293 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0d345372-d7c4-4094-b9cb-e2afbd2dbf54" (UID: "0d345372-d7c4-4094-b9cb-e2afbd2dbf54"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.368062 4845 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.368302 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.368402 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.368473 4845 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.368540 4845 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d345372-d7c4-4094-b9cb-e2afbd2dbf54-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.526623 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6898599c95-65qmn"] Feb 02 10:53:32 crc kubenswrapper[4845]: I0202 10:53:32.539429 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6898599c95-65qmn"] Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.563862 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.594770 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rflw4\" (UniqueName: \"kubernetes.io/projected/183b0ef9-490f-43a1-a464-2bd64a820ebd-kube-api-access-rflw4\") pod \"183b0ef9-490f-43a1-a464-2bd64a820ebd\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.594956 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-config-data\") pod \"183b0ef9-490f-43a1-a464-2bd64a820ebd\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.595028 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-combined-ca-bundle\") pod \"183b0ef9-490f-43a1-a464-2bd64a820ebd\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.595080 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-db-sync-config-data\") pod \"183b0ef9-490f-43a1-a464-2bd64a820ebd\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.595151 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183b0ef9-490f-43a1-a464-2bd64a820ebd-etc-machine-id\") pod \"183b0ef9-490f-43a1-a464-2bd64a820ebd\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.595198 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-scripts\") pod \"183b0ef9-490f-43a1-a464-2bd64a820ebd\" (UID: \"183b0ef9-490f-43a1-a464-2bd64a820ebd\") " Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.600988 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-scripts" (OuterVolumeSpecName: "scripts") pod "183b0ef9-490f-43a1-a464-2bd64a820ebd" (UID: "183b0ef9-490f-43a1-a464-2bd64a820ebd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.603814 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183b0ef9-490f-43a1-a464-2bd64a820ebd-kube-api-access-rflw4" (OuterVolumeSpecName: "kube-api-access-rflw4") pod "183b0ef9-490f-43a1-a464-2bd64a820ebd" (UID: "183b0ef9-490f-43a1-a464-2bd64a820ebd"). InnerVolumeSpecName "kube-api-access-rflw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.612782 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/183b0ef9-490f-43a1-a464-2bd64a820ebd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "183b0ef9-490f-43a1-a464-2bd64a820ebd" (UID: "183b0ef9-490f-43a1-a464-2bd64a820ebd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.613027 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "183b0ef9-490f-43a1-a464-2bd64a820ebd" (UID: "183b0ef9-490f-43a1-a464-2bd64a820ebd"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.641747 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "183b0ef9-490f-43a1-a464-2bd64a820ebd" (UID: "183b0ef9-490f-43a1-a464-2bd64a820ebd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.667266 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-config-data" (OuterVolumeSpecName: "config-data") pod "183b0ef9-490f-43a1-a464-2bd64a820ebd" (UID: "183b0ef9-490f-43a1-a464-2bd64a820ebd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.698541 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.698577 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rflw4\" (UniqueName: \"kubernetes.io/projected/183b0ef9-490f-43a1-a464-2bd64a820ebd-kube-api-access-rflw4\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.698591 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.698600 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 
10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.698608 4845 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/183b0ef9-490f-43a1-a464-2bd64a820ebd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.698616 4845 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183b0ef9-490f-43a1-a464-2bd64a820ebd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:33 crc kubenswrapper[4845]: I0202 10:53:33.726200 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" path="/var/lib/kubelet/pods/0d345372-d7c4-4094-b9cb-e2afbd2dbf54/volumes" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.155747 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g8b4r" event={"ID":"183b0ef9-490f-43a1-a464-2bd64a820ebd","Type":"ContainerDied","Data":"8487cd80461a5551ea17adb0f75cb6e4ce51ee5bd0eda70e468bd0162117e3f9"} Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.156132 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8487cd80461a5551ea17adb0f75cb6e4ce51ee5bd0eda70e468bd0162117e3f9" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.155980 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g8b4r" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.415461 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:53:34 crc kubenswrapper[4845]: E0202 10:53:34.425109 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-httpd" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.425142 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-httpd" Feb 02 10:53:34 crc kubenswrapper[4845]: E0202 10:53:34.425167 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-api" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.425173 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-api" Feb 02 10:53:34 crc kubenswrapper[4845]: E0202 10:53:34.425205 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183b0ef9-490f-43a1-a464-2bd64a820ebd" containerName="cinder-db-sync" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.425211 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="183b0ef9-490f-43a1-a464-2bd64a820ebd" containerName="cinder-db-sync" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.425530 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="183b0ef9-490f-43a1-a464-2bd64a820ebd" containerName="cinder-db-sync" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.425562 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-httpd" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.425575 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d345372-d7c4-4094-b9cb-e2afbd2dbf54" containerName="neutron-api" Feb 02 
10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.426742 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.447944 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.448594 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2glkz" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.448838 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.449007 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.452123 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.514748 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-scripts\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.514841 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlb2n\" (UniqueName: \"kubernetes.io/projected/da14c1bc-4bf5-451d-b547-f4695a1f1099-kube-api-access-mlb2n\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.514970 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.515070 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da14c1bc-4bf5-451d-b547-f4695a1f1099-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.515153 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.515212 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.527764 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qspp8"] Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.530297 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.543155 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qspp8"] Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.620186 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.620262 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.620333 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-scripts\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.620391 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlb2n\" (UniqueName: \"kubernetes.io/projected/da14c1bc-4bf5-451d-b547-f4695a1f1099-kube-api-access-mlb2n\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.620513 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.620606 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da14c1bc-4bf5-451d-b547-f4695a1f1099-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.620727 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da14c1bc-4bf5-451d-b547-f4695a1f1099-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.653730 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlb2n\" (UniqueName: \"kubernetes.io/projected/da14c1bc-4bf5-451d-b547-f4695a1f1099-kube-api-access-mlb2n\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.658793 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-scripts\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.667400 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.678328 4845 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.705931 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.721771 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.722875 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-config\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.722935 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.722980 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc 
kubenswrapper[4845]: I0202 10:53:34.725984 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.726030 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q98wr\" (UniqueName: \"kubernetes.io/projected/f57453e0-7229-4521-9d4a-769dc8c888fa-kube-api-access-q98wr\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.726141 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.728227 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.730679 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.752033 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.772124 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.829066 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-config\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.829111 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.829144 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.829271 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.829289 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q98wr\" (UniqueName: \"kubernetes.io/projected/f57453e0-7229-4521-9d4a-769dc8c888fa-kube-api-access-q98wr\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 
10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.829323 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.830248 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.831599 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-config\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.832304 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.833054 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.838314 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.862440 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q98wr\" (UniqueName: \"kubernetes.io/projected/f57453e0-7229-4521-9d4a-769dc8c888fa-kube-api-access-q98wr\") pod \"dnsmasq-dns-5c9776ccc5-qspp8\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.952388 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06105adf-bd97-410f-922f-cb54a637955d-logs\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.952515 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data-custom\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.952592 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-scripts\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.952636 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8jww\" (UniqueName: 
\"kubernetes.io/projected/06105adf-bd97-410f-922f-cb54a637955d-kube-api-access-k8jww\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.952678 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.952754 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:34 crc kubenswrapper[4845]: I0202 10:53:34.952940 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06105adf-bd97-410f-922f-cb54a637955d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.013031 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.058477 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06105adf-bd97-410f-922f-cb54a637955d-logs\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.058576 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data-custom\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.058683 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-scripts\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.058719 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8jww\" (UniqueName: \"kubernetes.io/projected/06105adf-bd97-410f-922f-cb54a637955d-kube-api-access-k8jww\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.058755 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.058816 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.058972 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06105adf-bd97-410f-922f-cb54a637955d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.059080 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06105adf-bd97-410f-922f-cb54a637955d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.059557 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06105adf-bd97-410f-922f-cb54a637955d-logs\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.075421 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-scripts\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.083516 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data-custom\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.083776 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.092031 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.093597 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8jww\" (UniqueName: \"kubernetes.io/projected/06105adf-bd97-410f-922f-cb54a637955d-kube-api-access-k8jww\") pod \"cinder-api-0\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.195769 4845 generic.go:334] "Generic (PLEG): container finished" podID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerID="b43d9c1e2c65f2ee90a2f57f9bbc1d12c9cf5e6cdd4115a797d97d1794b7caa6" exitCode=0 Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.195823 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c848b759d-9s78l" event={"ID":"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98","Type":"ContainerDied","Data":"b43d9c1e2c65f2ee90a2f57f9bbc1d12c9cf5e6cdd4115a797d97d1794b7caa6"} Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.229990 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.330656 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.373356 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data-custom\") pod \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.373454 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn24r\" (UniqueName: \"kubernetes.io/projected/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-kube-api-access-fn24r\") pod \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.373663 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data\") pod \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.373736 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-logs\") pod \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.373930 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-combined-ca-bundle\") pod \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\" (UID: \"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98\") " Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.380018 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" (UID: "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.385692 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-logs" (OuterVolumeSpecName: "logs") pod "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" (UID: "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.388401 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-kube-api-access-fn24r" (OuterVolumeSpecName: "kube-api-access-fn24r") pod "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" (UID: "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98"). InnerVolumeSpecName "kube-api-access-fn24r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.448498 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" (UID: "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.478170 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.478210 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.478226 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn24r\" (UniqueName: \"kubernetes.io/projected/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-kube-api-access-fn24r\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.478239 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.484592 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data" (OuterVolumeSpecName: "config-data") pod "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" (UID: "7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.534193 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.579927 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:35 crc kubenswrapper[4845]: I0202 10:53:35.764445 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qspp8"] Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.017154 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:53:36 crc kubenswrapper[4845]: W0202 10:53:36.019163 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06105adf_bd97_410f_922f_cb54a637955d.slice/crio-10b5879f559c2a482eb1195a6f65f86b7381b0fca31a6c70d7384ff0ddf1d778 WatchSource:0}: Error finding container 10b5879f559c2a482eb1195a6f65f86b7381b0fca31a6c70d7384ff0ddf1d778: Status 404 returned error can't find the container with id 10b5879f559c2a482eb1195a6f65f86b7381b0fca31a6c70d7384ff0ddf1d778 Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.210905 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06105adf-bd97-410f-922f-cb54a637955d","Type":"ContainerStarted","Data":"10b5879f559c2a482eb1195a6f65f86b7381b0fca31a6c70d7384ff0ddf1d778"} Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.212389 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da14c1bc-4bf5-451d-b547-f4695a1f1099","Type":"ContainerStarted","Data":"27e0db0e179d31ce7ea0e79507fa9b8ddbc8dd15b66fff98db116d1b86140fed"} Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.219179 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-c848b759d-9s78l" event={"ID":"7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98","Type":"ContainerDied","Data":"9e1e2aebf9151dc6f3a68108d64e5db12d097d08be7641fdb5a06d1241111d90"} Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.219235 4845 scope.go:117] "RemoveContainer" containerID="b43d9c1e2c65f2ee90a2f57f9bbc1d12c9cf5e6cdd4115a797d97d1794b7caa6" Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.219368 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-c848b759d-9s78l" Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.243215 4845 generic.go:334] "Generic (PLEG): container finished" podID="f57453e0-7229-4521-9d4a-769dc8c888fa" containerID="de805387ebe60937953bfa8ca82aa39520aa199cff779fd5003f9f946eb62840" exitCode=0 Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.243261 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" event={"ID":"f57453e0-7229-4521-9d4a-769dc8c888fa","Type":"ContainerDied","Data":"de805387ebe60937953bfa8ca82aa39520aa199cff779fd5003f9f946eb62840"} Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.243308 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" event={"ID":"f57453e0-7229-4521-9d4a-769dc8c888fa","Type":"ContainerStarted","Data":"4772888bdcd0116c54162c4f207b2005eb03b8a93b86ecf4467b09a24d45e538"} Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.247922 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-c848b759d-9s78l"] Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.258816 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-c848b759d-9s78l"] Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.373781 4845 scope.go:117] "RemoveContainer" 
containerID="176473ee3218eaa7c0eebfe1c20972c9902b004b5527eec40d3ad45e506fbd6b" Feb 02 10:53:36 crc kubenswrapper[4845]: I0202 10:53:36.519259 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:53:37 crc kubenswrapper[4845]: I0202 10:53:37.269409 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" event={"ID":"f57453e0-7229-4521-9d4a-769dc8c888fa","Type":"ContainerStarted","Data":"e5275c3566c176f1596ba660dd53acc1de661d183bfb3ade1e8619590805afd2"} Feb 02 10:53:37 crc kubenswrapper[4845]: I0202 10:53:37.269721 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:53:37 crc kubenswrapper[4845]: I0202 10:53:37.279412 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06105adf-bd97-410f-922f-cb54a637955d","Type":"ContainerStarted","Data":"0b451f378e885ddb27d4b9a42dcd386c2a29d3bf05ddbf776583d1ff3fa31571"} Feb 02 10:53:37 crc kubenswrapper[4845]: I0202 10:53:37.282818 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da14c1bc-4bf5-451d-b547-f4695a1f1099","Type":"ContainerStarted","Data":"36d898e83e0f0d7e2fb72b22a68f13dc46338bf7f3eccf52dcd8d9142d72c38d"} Feb 02 10:53:37 crc kubenswrapper[4845]: I0202 10:53:37.292823 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" podStartSLOduration=3.292804764 podStartE2EDuration="3.292804764s" podCreationTimestamp="2026-02-02 10:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:37.290270461 +0000 UTC m=+1298.381671951" watchObservedRunningTime="2026-02-02 10:53:37.292804764 +0000 UTC m=+1298.384206204" Feb 02 10:53:37 crc kubenswrapper[4845]: I0202 10:53:37.728347 4845 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" path="/var/lib/kubelet/pods/7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98/volumes" Feb 02 10:53:38 crc kubenswrapper[4845]: I0202 10:53:38.295389 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06105adf-bd97-410f-922f-cb54a637955d","Type":"ContainerStarted","Data":"ddf1463358b0ba98c5bf9d38609a756d1bcc25cada5f4b3714a84897700e8709"} Feb 02 10:53:38 crc kubenswrapper[4845]: I0202 10:53:38.295526 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="06105adf-bd97-410f-922f-cb54a637955d" containerName="cinder-api-log" containerID="cri-o://0b451f378e885ddb27d4b9a42dcd386c2a29d3bf05ddbf776583d1ff3fa31571" gracePeriod=30 Feb 02 10:53:38 crc kubenswrapper[4845]: I0202 10:53:38.295567 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 10:53:38 crc kubenswrapper[4845]: I0202 10:53:38.295611 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="06105adf-bd97-410f-922f-cb54a637955d" containerName="cinder-api" containerID="cri-o://ddf1463358b0ba98c5bf9d38609a756d1bcc25cada5f4b3714a84897700e8709" gracePeriod=30 Feb 02 10:53:38 crc kubenswrapper[4845]: I0202 10:53:38.303150 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da14c1bc-4bf5-451d-b547-f4695a1f1099","Type":"ContainerStarted","Data":"1fc49f683570faab3ee6de0a4e2c9f7a1e48f5d5ec7c0c827c712793bfb544b5"} Feb 02 10:53:38 crc kubenswrapper[4845]: I0202 10:53:38.323399 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.323375549 podStartE2EDuration="4.323375549s" podCreationTimestamp="2026-02-02 10:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-02 10:53:38.314303968 +0000 UTC m=+1299.405705418" watchObservedRunningTime="2026-02-02 10:53:38.323375549 +0000 UTC m=+1299.414776999" Feb 02 10:53:38 crc kubenswrapper[4845]: I0202 10:53:38.348262 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.489098286 podStartE2EDuration="4.348245045s" podCreationTimestamp="2026-02-02 10:53:34 +0000 UTC" firstStartedPulling="2026-02-02 10:53:35.589417975 +0000 UTC m=+1296.680819425" lastFinishedPulling="2026-02-02 10:53:36.448564734 +0000 UTC m=+1297.539966184" observedRunningTime="2026-02-02 10:53:38.339684899 +0000 UTC m=+1299.431086359" watchObservedRunningTime="2026-02-02 10:53:38.348245045 +0000 UTC m=+1299.439646495" Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.313006 4845 generic.go:334] "Generic (PLEG): container finished" podID="06105adf-bd97-410f-922f-cb54a637955d" containerID="0b451f378e885ddb27d4b9a42dcd386c2a29d3bf05ddbf776583d1ff3fa31571" exitCode=143 Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.313112 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06105adf-bd97-410f-922f-cb54a637955d","Type":"ContainerDied","Data":"0b451f378e885ddb27d4b9a42dcd386c2a29d3bf05ddbf776583d1ff3fa31571"} Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.609414 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68f64c64d8-r7nkx" Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.619114 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68f64c64d8-r7nkx" Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.694288 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7fd6897c68-cspbg"] Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.694652 4845 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-7fd6897c68-cspbg" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-log" containerID="cri-o://72291ce4c24275105ea624fdae6cb6154c6bb75e37f637d2f201663a1789f5ec" gracePeriod=30
Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.695192 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7fd6897c68-cspbg" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-api" containerID="cri-o://a21d26fcb519a4c746b991dcfeca12c3245ddc62427af655c6f5de2c40b04948" gracePeriod=30
Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.699945 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/placement-7fd6897c68-cspbg" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.199:8778/\": EOF"
Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.700094 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-7fd6897c68-cspbg" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.199:8778/\": EOF"
Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.700411 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-7fd6897c68-cspbg" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.199:8778/\": EOF"
Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.700722 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/placement-7fd6897c68-cspbg" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.199:8778/\": EOF"
Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.707837 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-7fd6897c68-cspbg" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.199:8778/\": EOF"
Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.708053 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-7fd6897c68-cspbg" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.199:8778/\": EOF"
Feb 02 10:53:39 crc kubenswrapper[4845]: I0202 10:53:39.810541 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 02 10:53:40 crc kubenswrapper[4845]: I0202 10:53:40.056270 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-c4f9db54b-5v9r8"
Feb 02 10:53:40 crc kubenswrapper[4845]: I0202 10:53:40.325322 4845 generic.go:334] "Generic (PLEG): container finished" podID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerID="72291ce4c24275105ea624fdae6cb6154c6bb75e37f637d2f201663a1789f5ec" exitCode=143
Feb 02 10:53:40 crc kubenswrapper[4845]: I0202 10:53:40.325440 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fd6897c68-cspbg" event={"ID":"3231a338-4ba7-4851-9fd5-a7ba84f13089","Type":"ContainerDied","Data":"72291ce4c24275105ea624fdae6cb6154c6bb75e37f637d2f201663a1789f5ec"}
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.386961 4845 generic.go:334] "Generic (PLEG): container finished" podID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerID="a21d26fcb519a4c746b991dcfeca12c3245ddc62427af655c6f5de2c40b04948" exitCode=0
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.387050 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fd6897c68-cspbg" event={"ID":"3231a338-4ba7-4851-9fd5-a7ba84f13089","Type":"ContainerDied","Data":"a21d26fcb519a4c746b991dcfeca12c3245ddc62427af655c6f5de2c40b04948"}
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.574397 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.706090 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-internal-tls-certs\") pod \"3231a338-4ba7-4851-9fd5-a7ba84f13089\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") "
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.706232 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-public-tls-certs\") pod \"3231a338-4ba7-4851-9fd5-a7ba84f13089\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") "
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.706366 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-config-data\") pod \"3231a338-4ba7-4851-9fd5-a7ba84f13089\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") "
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.706436 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll5b8\" (UniqueName: \"kubernetes.io/projected/3231a338-4ba7-4851-9fd5-a7ba84f13089-kube-api-access-ll5b8\") pod \"3231a338-4ba7-4851-9fd5-a7ba84f13089\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") "
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.706603 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-combined-ca-bundle\") pod \"3231a338-4ba7-4851-9fd5-a7ba84f13089\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") "
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.706671 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-scripts\") pod \"3231a338-4ba7-4851-9fd5-a7ba84f13089\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") "
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.706749 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3231a338-4ba7-4851-9fd5-a7ba84f13089-logs\") pod \"3231a338-4ba7-4851-9fd5-a7ba84f13089\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") "
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.708476 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3231a338-4ba7-4851-9fd5-a7ba84f13089-logs" (OuterVolumeSpecName: "logs") pod "3231a338-4ba7-4851-9fd5-a7ba84f13089" (UID: "3231a338-4ba7-4851-9fd5-a7ba84f13089"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.713457 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3231a338-4ba7-4851-9fd5-a7ba84f13089-kube-api-access-ll5b8" (OuterVolumeSpecName: "kube-api-access-ll5b8") pod "3231a338-4ba7-4851-9fd5-a7ba84f13089" (UID: "3231a338-4ba7-4851-9fd5-a7ba84f13089"). InnerVolumeSpecName "kube-api-access-ll5b8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.715194 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-scripts" (OuterVolumeSpecName: "scripts") pod "3231a338-4ba7-4851-9fd5-a7ba84f13089" (UID: "3231a338-4ba7-4851-9fd5-a7ba84f13089"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.821531 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.821569 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3231a338-4ba7-4851-9fd5-a7ba84f13089-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.821583 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll5b8\" (UniqueName: \"kubernetes.io/projected/3231a338-4ba7-4851-9fd5-a7ba84f13089-kube-api-access-ll5b8\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.824585 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3231a338-4ba7-4851-9fd5-a7ba84f13089" (UID: "3231a338-4ba7-4851-9fd5-a7ba84f13089"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.848829 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Feb 02 10:53:43 crc kubenswrapper[4845]: E0202 10:53:43.849632 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerName="barbican-api"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.851091 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerName="barbican-api"
Feb 02 10:53:43 crc kubenswrapper[4845]: E0202 10:53:43.851128 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-log"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.851173 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-log"
Feb 02 10:53:43 crc kubenswrapper[4845]: E0202 10:53:43.851193 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-api"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.851200 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-api"
Feb 02 10:53:43 crc kubenswrapper[4845]: E0202 10:53:43.851213 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerName="barbican-api-log"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.851220 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerName="barbican-api-log"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.851516 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-api"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.851585 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerName="barbican-api"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.851600 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" containerName="placement-log"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.851618 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ddd7b77-b40e-4cbd-bce4-eecb7b7eae98" containerName="barbican-api-log"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.852795 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.852940 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.856948 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.857161 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.857343 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-xtwwc"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.863031 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-config-data" (OuterVolumeSpecName: "config-data") pod "3231a338-4ba7-4851-9fd5-a7ba84f13089" (UID: "3231a338-4ba7-4851-9fd5-a7ba84f13089"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.894987 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3231a338-4ba7-4851-9fd5-a7ba84f13089" (UID: "3231a338-4ba7-4851-9fd5-a7ba84f13089"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.922683 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3231a338-4ba7-4851-9fd5-a7ba84f13089" (UID: "3231a338-4ba7-4851-9fd5-a7ba84f13089"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.923537 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-internal-tls-certs\") pod \"3231a338-4ba7-4851-9fd5-a7ba84f13089\" (UID: \"3231a338-4ba7-4851-9fd5-a7ba84f13089\") "
Feb 02 10:53:43 crc kubenswrapper[4845]: W0202 10:53:43.923697 4845 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3231a338-4ba7-4851-9fd5-a7ba84f13089/volumes/kubernetes.io~secret/internal-tls-certs
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.923730 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3231a338-4ba7-4851-9fd5-a7ba84f13089" (UID: "3231a338-4ba7-4851-9fd5-a7ba84f13089"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.923897 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8kc6\" (UniqueName: \"kubernetes.io/projected/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-kube-api-access-f8kc6\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.923949 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.923977 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-openstack-config\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.924324 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.924747 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.924765 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.924776 4845 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:43 crc kubenswrapper[4845]: I0202 10:53:43.924786 4845 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3231a338-4ba7-4851-9fd5-a7ba84f13089-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.026487 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-openstack-config\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.026695 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.027057 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8kc6\" (UniqueName: \"kubernetes.io/projected/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-kube-api-access-f8kc6\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.027135 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.029432 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-openstack-config\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.032302 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.033367 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.046019 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8kc6\" (UniqueName: \"kubernetes.io/projected/c10a41f9-4bda-4d90-81c1-09ed21f00b2b-kube-api-access-f8kc6\") pod \"openstackclient\" (UID: \"c10a41f9-4bda-4d90-81c1-09ed21f00b2b\") " pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.181837 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.412369 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fd6897c68-cspbg" event={"ID":"3231a338-4ba7-4851-9fd5-a7ba84f13089","Type":"ContainerDied","Data":"228de4e8bc7c765fb5d366d131a0a0268b9b4fac526b28962bf893b2beef69a2"}
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.412733 4845 scope.go:117] "RemoveContainer" containerID="a21d26fcb519a4c746b991dcfeca12c3245ddc62427af655c6f5de2c40b04948"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.412449 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fd6897c68-cspbg"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.452780 4845 scope.go:117] "RemoveContainer" containerID="72291ce4c24275105ea624fdae6cb6154c6bb75e37f637d2f201663a1789f5ec"
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.457955 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7fd6897c68-cspbg"]
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.467506 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7fd6897c68-cspbg"]
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.683963 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Feb 02 10:53:44 crc kubenswrapper[4845]: W0202 10:53:44.684525 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc10a41f9_4bda_4d90_81c1_09ed21f00b2b.slice/crio-d77ee8bee66783f8327bc67c1ef3799f9c9d870eca7bb6938cdb74b592de47d9 WatchSource:0}: Error finding container d77ee8bee66783f8327bc67c1ef3799f9c9d870eca7bb6938cdb74b592de47d9: Status 404 returned error can't find the container with id d77ee8bee66783f8327bc67c1ef3799f9c9d870eca7bb6938cdb74b592de47d9
Feb 02 10:53:44 crc kubenswrapper[4845]: I0202 10:53:44.984380 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.014053 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8"
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.069111 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.125502 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-4vk87"]
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.125736 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" podUID="1e98b98e-a993-4000-90f3-3372541369fb" containerName="dnsmasq-dns" containerID="cri-o://d65ee280aa67a3b9d4ada8a3d8e251139871dc36b778ac679ad93aaaa994de0c" gracePeriod=10
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.426242 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c10a41f9-4bda-4d90-81c1-09ed21f00b2b","Type":"ContainerStarted","Data":"d77ee8bee66783f8327bc67c1ef3799f9c9d870eca7bb6938cdb74b592de47d9"}
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.438066 4845 generic.go:334] "Generic (PLEG): container finished" podID="1e98b98e-a993-4000-90f3-3372541369fb" containerID="d65ee280aa67a3b9d4ada8a3d8e251139871dc36b778ac679ad93aaaa994de0c" exitCode=0
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.438135 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" event={"ID":"1e98b98e-a993-4000-90f3-3372541369fb","Type":"ContainerDied","Data":"d65ee280aa67a3b9d4ada8a3d8e251139871dc36b778ac679ad93aaaa994de0c"}
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.444916 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerName="cinder-scheduler" containerID="cri-o://36d898e83e0f0d7e2fb72b22a68f13dc46338bf7f3eccf52dcd8d9142d72c38d" gracePeriod=30
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.444969 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerName="probe" containerID="cri-o://1fc49f683570faab3ee6de0a4e2c9f7a1e48f5d5ec7c0c827c712793bfb544b5" gracePeriod=30
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.768282 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-4vk87"
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.768766 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3231a338-4ba7-4851-9fd5-a7ba84f13089" path="/var/lib/kubelet/pods/3231a338-4ba7-4851-9fd5-a7ba84f13089/volumes"
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.771709 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-sb\") pod \"1e98b98e-a993-4000-90f3-3372541369fb\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") "
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.771782 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c85th\" (UniqueName: \"kubernetes.io/projected/1e98b98e-a993-4000-90f3-3372541369fb-kube-api-access-c85th\") pod \"1e98b98e-a993-4000-90f3-3372541369fb\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") "
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.772958 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-nb\") pod \"1e98b98e-a993-4000-90f3-3372541369fb\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") "
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.773042 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-svc\") pod \"1e98b98e-a993-4000-90f3-3372541369fb\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") "
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.773130 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-swift-storage-0\") pod \"1e98b98e-a993-4000-90f3-3372541369fb\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") "
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.773188 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-config\") pod \"1e98b98e-a993-4000-90f3-3372541369fb\" (UID: \"1e98b98e-a993-4000-90f3-3372541369fb\") "
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.783915 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e98b98e-a993-4000-90f3-3372541369fb-kube-api-access-c85th" (OuterVolumeSpecName: "kube-api-access-c85th") pod "1e98b98e-a993-4000-90f3-3372541369fb" (UID: "1e98b98e-a993-4000-90f3-3372541369fb"). InnerVolumeSpecName "kube-api-access-c85th". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.876143 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c85th\" (UniqueName: \"kubernetes.io/projected/1e98b98e-a993-4000-90f3-3372541369fb-kube-api-access-c85th\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.894407 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1e98b98e-a993-4000-90f3-3372541369fb" (UID: "1e98b98e-a993-4000-90f3-3372541369fb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.930479 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1e98b98e-a993-4000-90f3-3372541369fb" (UID: "1e98b98e-a993-4000-90f3-3372541369fb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.932646 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1e98b98e-a993-4000-90f3-3372541369fb" (UID: "1e98b98e-a993-4000-90f3-3372541369fb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.946328 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e98b98e-a993-4000-90f3-3372541369fb" (UID: "1e98b98e-a993-4000-90f3-3372541369fb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.949423 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-config" (OuterVolumeSpecName: "config") pod "1e98b98e-a993-4000-90f3-3372541369fb" (UID: "1e98b98e-a993-4000-90f3-3372541369fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.976880 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.977145 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.977156 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.977165 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:45 crc kubenswrapper[4845]: I0202 10:53:45.977174 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e98b98e-a993-4000-90f3-3372541369fb-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:53:46 crc kubenswrapper[4845]: E0202 10:53:46.196919 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda14c1bc_4bf5_451d_b547_f4695a1f1099.slice/crio-conmon-1fc49f683570faab3ee6de0a4e2c9f7a1e48f5d5ec7c0c827c712793bfb544b5.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.237461 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.237551 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.474101 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" event={"ID":"1e98b98e-a993-4000-90f3-3372541369fb","Type":"ContainerDied","Data":"46aea51eef4e5bca23196251795de36a78dad7c530d09f830de8c2b61b899f53"}
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.474166 4845 scope.go:117] "RemoveContainer" containerID="d65ee280aa67a3b9d4ada8a3d8e251139871dc36b778ac679ad93aaaa994de0c"
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.474322 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-4vk87"
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.482271 4845 generic.go:334] "Generic (PLEG): container finished" podID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerID="1fc49f683570faab3ee6de0a4e2c9f7a1e48f5d5ec7c0c827c712793bfb544b5" exitCode=0
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.482326 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da14c1bc-4bf5-451d-b547-f4695a1f1099","Type":"ContainerDied","Data":"1fc49f683570faab3ee6de0a4e2c9f7a1e48f5d5ec7c0c827c712793bfb544b5"}
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.520008 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-4vk87"]
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.523300 4845 scope.go:117] "RemoveContainer" containerID="a506c6cd7f566d8573c6029c7963e6c4b4d345abfac828b450e7916b80814864"
Feb 02 10:53:46 crc kubenswrapper[4845]: I0202 10:53:46.546866 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-4vk87"]
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.736380 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e98b98e-a993-4000-90f3-3372541369fb" path="/var/lib/kubelet/pods/1e98b98e-a993-4000-90f3-3372541369fb/volumes"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.797311 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-67878d9fbc-npvwk"]
Feb 02 10:53:47 crc kubenswrapper[4845]: E0202 10:53:47.797983 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e98b98e-a993-4000-90f3-3372541369fb" containerName="dnsmasq-dns"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.798081 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e98b98e-a993-4000-90f3-3372541369fb" containerName="dnsmasq-dns"
Feb 02 10:53:47 crc kubenswrapper[4845]: E0202 10:53:47.798148 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e98b98e-a993-4000-90f3-3372541369fb" containerName="init"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.798232 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e98b98e-a993-4000-90f3-3372541369fb" containerName="init"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.798513 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e98b98e-a993-4000-90f3-3372541369fb" containerName="dnsmasq-dns"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.799743 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67878d9fbc-npvwk"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.801696 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.802698 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.803316 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.812445 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67878d9fbc-npvwk"]
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.816479 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-public-tls-certs\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.816534 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcrb7\" (UniqueName: \"kubernetes.io/projected/1d9f4b80-6273-4d77-9309-2ffecc5acc64-kube-api-access-mcrb7\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.816569 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-config-data\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.816650 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-internal-tls-certs\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.816726 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-combined-ca-bundle\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.816756 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9f4b80-6273-4d77-9309-2ffecc5acc64-log-httpd\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk"
Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.816790 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9f4b80-6273-4d77-9309-2ffecc5acc64-run-httpd\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.816846 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1d9f4b80-6273-4d77-9309-2ffecc5acc64-etc-swift\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.919112 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-combined-ca-bundle\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.919347 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9f4b80-6273-4d77-9309-2ffecc5acc64-log-httpd\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.919443 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9f4b80-6273-4d77-9309-2ffecc5acc64-run-httpd\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.919559 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/1d9f4b80-6273-4d77-9309-2ffecc5acc64-etc-swift\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.919747 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-public-tls-certs\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.919866 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcrb7\" (UniqueName: \"kubernetes.io/projected/1d9f4b80-6273-4d77-9309-2ffecc5acc64-kube-api-access-mcrb7\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.919986 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-config-data\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.920109 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-internal-tls-certs\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.928792 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-public-tls-certs\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.929694 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9f4b80-6273-4d77-9309-2ffecc5acc64-log-httpd\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.930536 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1d9f4b80-6273-4d77-9309-2ffecc5acc64-run-httpd\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.934234 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-combined-ca-bundle\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.934376 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-internal-tls-certs\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.935450 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d9f4b80-6273-4d77-9309-2ffecc5acc64-config-data\") pod 
\"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.944203 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1d9f4b80-6273-4d77-9309-2ffecc5acc64-etc-swift\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.957895 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcrb7\" (UniqueName: \"kubernetes.io/projected/1d9f4b80-6273-4d77-9309-2ffecc5acc64-kube-api-access-mcrb7\") pod \"swift-proxy-67878d9fbc-npvwk\" (UID: \"1d9f4b80-6273-4d77-9309-2ffecc5acc64\") " pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:47 crc kubenswrapper[4845]: I0202 10:53:47.995725 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 02 10:53:48 crc kubenswrapper[4845]: I0202 10:53:48.143485 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:48 crc kubenswrapper[4845]: I0202 10:53:48.894723 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67878d9fbc-npvwk"] Feb 02 10:53:49 crc kubenswrapper[4845]: I0202 10:53:49.521981 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67878d9fbc-npvwk" event={"ID":"1d9f4b80-6273-4d77-9309-2ffecc5acc64","Type":"ContainerStarted","Data":"5cea917cb8a90d8b39a3c28df04d02236b975af47d729fa57e6fca4b89381be8"} Feb 02 10:53:49 crc kubenswrapper[4845]: I0202 10:53:49.522661 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67878d9fbc-npvwk" event={"ID":"1d9f4b80-6273-4d77-9309-2ffecc5acc64","Type":"ContainerStarted","Data":"528266d746d8fd6ae55a22daa001556d0d8858e1559232b3016ca31c80fed2cf"} Feb 02 10:53:49 crc kubenswrapper[4845]: I0202 10:53:49.522683 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67878d9fbc-npvwk" event={"ID":"1d9f4b80-6273-4d77-9309-2ffecc5acc64","Type":"ContainerStarted","Data":"fc0768979a31fc82dc6576fdb675fd4f1aba2309fa75e8c4607d913f063143a6"} Feb 02 10:53:49 crc kubenswrapper[4845]: I0202 10:53:49.522714 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:49 crc kubenswrapper[4845]: I0202 10:53:49.553965 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-67878d9fbc-npvwk" podStartSLOduration=2.553934486 podStartE2EDuration="2.553934486s" podCreationTimestamp="2026-02-02 10:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:49.541102406 +0000 UTC m=+1310.632503856" watchObservedRunningTime="2026-02-02 10:53:49.553934486 +0000 UTC m=+1310.645335956" Feb 02 10:53:50 crc kubenswrapper[4845]: I0202 10:53:50.537695 4845 
generic.go:334] "Generic (PLEG): container finished" podID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerID="36d898e83e0f0d7e2fb72b22a68f13dc46338bf7f3eccf52dcd8d9142d72c38d" exitCode=0 Feb 02 10:53:50 crc kubenswrapper[4845]: I0202 10:53:50.538990 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da14c1bc-4bf5-451d-b547-f4695a1f1099","Type":"ContainerDied","Data":"36d898e83e0f0d7e2fb72b22a68f13dc46338bf7f3eccf52dcd8d9142d72c38d"} Feb 02 10:53:50 crc kubenswrapper[4845]: I0202 10:53:50.539042 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:50 crc kubenswrapper[4845]: I0202 10:53:50.632935 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-85ff748b95-4vk87" podUID="1e98b98e-a993-4000-90f3-3372541369fb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.203:5353: i/o timeout" Feb 02 10:53:51 crc kubenswrapper[4845]: I0202 10:53:51.683796 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:51 crc kubenswrapper[4845]: I0202 10:53:51.684433 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="ceilometer-central-agent" containerID="cri-o://fbe9c7e888f7c781c1053770c1b8f9cfc807348795f8c409217a85d5272ec120" gracePeriod=30 Feb 02 10:53:51 crc kubenswrapper[4845]: I0202 10:53:51.685008 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="proxy-httpd" containerID="cri-o://8728f9654023c3d16f466f99002e731e63223d9431b4fa5e2833683a43d715a8" gracePeriod=30 Feb 02 10:53:51 crc kubenswrapper[4845]: I0202 10:53:51.685043 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="sg-core" containerID="cri-o://667445270cb8d9e7bc55a98bf683b24cb0b8dad3cdc9fba27ac254a584303435" gracePeriod=30 Feb 02 10:53:51 crc kubenswrapper[4845]: I0202 10:53:51.685025 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="ceilometer-notification-agent" containerID="cri-o://cf0af5c5e3f564b6556befcc6a6f25d0bf34be4299c282d46a9c4ef4ba6b6020" gracePeriod=30 Feb 02 10:53:51 crc kubenswrapper[4845]: I0202 10:53:51.712118 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.571118 4845 generic.go:334] "Generic (PLEG): container finished" podID="e4430b5f-6421-41e2-b338-3b215c57957a" containerID="8728f9654023c3d16f466f99002e731e63223d9431b4fa5e2833683a43d715a8" exitCode=0 Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.571587 4845 generic.go:334] "Generic (PLEG): container finished" podID="e4430b5f-6421-41e2-b338-3b215c57957a" containerID="667445270cb8d9e7bc55a98bf683b24cb0b8dad3cdc9fba27ac254a584303435" exitCode=2 Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.571600 4845 generic.go:334] "Generic (PLEG): container finished" podID="e4430b5f-6421-41e2-b338-3b215c57957a" containerID="fbe9c7e888f7c781c1053770c1b8f9cfc807348795f8c409217a85d5272ec120" exitCode=0 Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.571219 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerDied","Data":"8728f9654023c3d16f466f99002e731e63223d9431b4fa5e2833683a43d715a8"} Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.571660 4845 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerDied","Data":"667445270cb8d9e7bc55a98bf683b24cb0b8dad3cdc9fba27ac254a584303435"} Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.571679 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerDied","Data":"fbe9c7e888f7c781c1053770c1b8f9cfc807348795f8c409217a85d5272ec120"} Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.774437 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-567746f76f-zjfmt"] Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.776355 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.785520 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.785746 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.786003 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2czql" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.797605 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-567746f76f-zjfmt"] Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.904582 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-nh2sl"] Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.906943 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.936307 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-nh2sl"] Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.958435 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf6hs\" (UniqueName: \"kubernetes.io/projected/30fbb4bb-1391-411d-adda-a41d223aed00-kube-api-access-cf6hs\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.958668 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-combined-ca-bundle\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.958805 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.958842 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data-custom\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.976956 4845 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-cfnapi-b9b757468-zfd7s"] Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.978724 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:52 crc kubenswrapper[4845]: I0202 10:53:52.983442 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.007358 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-b9b757468-zfd7s"] Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.067842 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbcn2\" (UniqueName: \"kubernetes.io/projected/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-kube-api-access-sbcn2\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.067998 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.068055 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.068082 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data-custom\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.068221 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.068342 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-config\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.068385 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.068406 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.068950 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf6hs\" 
(UniqueName: \"kubernetes.io/projected/30fbb4bb-1391-411d-adda-a41d223aed00-kube-api-access-cf6hs\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.069420 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-combined-ca-bundle\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.087430 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.088089 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data-custom\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.088483 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-combined-ca-bundle\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.095236 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-b475b44dc-fr2qw"] Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.103165 4845 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.111835 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.113300 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf6hs\" (UniqueName: \"kubernetes.io/projected/30fbb4bb-1391-411d-adda-a41d223aed00-kube-api-access-cf6hs\") pod \"heat-engine-567746f76f-zjfmt\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.116405 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.155410 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-b475b44dc-fr2qw"] Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173407 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbcn2\" (UniqueName: \"kubernetes.io/projected/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-kube-api-access-sbcn2\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173488 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173514 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173545 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjss5\" (UniqueName: \"kubernetes.io/projected/08e1823d-46cd-40c5-bea1-162473f9a4ce-kube-api-access-vjss5\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173569 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data-custom\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173601 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173655 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-config\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173684 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173699 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.173756 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-combined-ca-bundle\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.174871 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.175431 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.176408 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.176954 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.177012 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-config\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.208130 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbcn2\" (UniqueName: \"kubernetes.io/projected/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-kube-api-access-sbcn2\") pod \"dnsmasq-dns-7756b9d78c-nh2sl\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.243253 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.276734 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data-custom\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.276809 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-combined-ca-bundle\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.276836 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.276928 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c57d\" (UniqueName: \"kubernetes.io/projected/81ca8604-de7c-4752-8bda-89fccd0c1218-kube-api-access-7c57d\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.276984 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: 
\"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.277015 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjss5\" (UniqueName: \"kubernetes.io/projected/08e1823d-46cd-40c5-bea1-162473f9a4ce-kube-api-access-vjss5\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.277036 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data-custom\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.277128 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-combined-ca-bundle\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.281160 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-combined-ca-bundle\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.286645 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " 
pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.292970 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data-custom\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.304737 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjss5\" (UniqueName: \"kubernetes.io/projected/08e1823d-46cd-40c5-bea1-162473f9a4ce-kube-api-access-vjss5\") pod \"heat-cfnapi-b9b757468-zfd7s\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.315424 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.379477 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data-custom\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.379555 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.379642 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c57d\" (UniqueName: 
\"kubernetes.io/projected/81ca8604-de7c-4752-8bda-89fccd0c1218-kube-api-access-7c57d\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.379775 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-combined-ca-bundle\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.383873 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data-custom\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.386073 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-combined-ca-bundle\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.386902 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data\") pod \"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.403821 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c57d\" (UniqueName: \"kubernetes.io/projected/81ca8604-de7c-4752-8bda-89fccd0c1218-kube-api-access-7c57d\") pod 
\"heat-api-b475b44dc-fr2qw\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.406384 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.586700 4845 generic.go:334] "Generic (PLEG): container finished" podID="e4430b5f-6421-41e2-b338-3b215c57957a" containerID="cf0af5c5e3f564b6556befcc6a6f25d0bf34be4299c282d46a9c4ef4ba6b6020" exitCode=0 Feb 02 10:53:53 crc kubenswrapper[4845]: I0202 10:53:53.586740 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerDied","Data":"cf0af5c5e3f564b6556befcc6a6f25d0bf34be4299c282d46a9c4ef4ba6b6020"} Feb 02 10:53:55 crc kubenswrapper[4845]: I0202 10:53:55.401254 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.206:3000/\": dial tcp 10.217.0.206:3000: connect: connection refused" Feb 02 10:53:56 crc kubenswrapper[4845]: I0202 10:53:56.484981 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:53:56 crc kubenswrapper[4845]: I0202 10:53:56.487745 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" containerName="glance-log" containerID="cri-o://1e24bbe2d8cd0583fc986cf6fd412f527daea58b4a85706eb389322bf6ad3af7" gracePeriod=30 Feb 02 10:53:56 crc kubenswrapper[4845]: I0202 10:53:56.488276 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" containerName="glance-httpd" 
containerID="cri-o://5c7e1e0f5ba6836be4b2cc0a23514474d1930ec71f3bf7b3e6b27bbccac7ee40" gracePeriod=30 Feb 02 10:53:56 crc kubenswrapper[4845]: I0202 10:53:56.641949 4845 generic.go:334] "Generic (PLEG): container finished" podID="48aa6807-1e0b-4eab-8255-01c885a24550" containerID="1e24bbe2d8cd0583fc986cf6fd412f527daea58b4a85706eb389322bf6ad3af7" exitCode=143 Feb 02 10:53:56 crc kubenswrapper[4845]: I0202 10:53:56.642003 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48aa6807-1e0b-4eab-8255-01c885a24550","Type":"ContainerDied","Data":"1e24bbe2d8cd0583fc986cf6fd412f527daea58b4a85706eb389322bf6ad3af7"} Feb 02 10:53:57 crc kubenswrapper[4845]: I0202 10:53:57.963399 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b8db7b6ff-lx6zl" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.069055 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c658d9d4-mvn9b"] Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.069684 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c658d9d4-mvn9b" podUID="381d0503-4113-48e1-a344-88e990400075" containerName="neutron-api" containerID="cri-o://2ea8dbd44d235dcde26b7387e5ca4d94d64fb20bb5d89d73c2a48c90da6ef1d6" gracePeriod=30 Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.070340 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c658d9d4-mvn9b" podUID="381d0503-4113-48e1-a344-88e990400075" containerName="neutron-httpd" containerID="cri-o://0318987d82017a4372eee65da6e584e622e1a0531f87350923bd3ea8ad37c0e3" gracePeriod=30 Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.153766 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.165319 4845 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67878d9fbc-npvwk" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.705247 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.707111 4845 generic.go:334] "Generic (PLEG): container finished" podID="381d0503-4113-48e1-a344-88e990400075" containerID="0318987d82017a4372eee65da6e584e622e1a0531f87350923bd3ea8ad37c0e3" exitCode=0 Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.707187 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c658d9d4-mvn9b" event={"ID":"381d0503-4113-48e1-a344-88e990400075","Type":"ContainerDied","Data":"0318987d82017a4372eee65da6e584e622e1a0531f87350923bd3ea8ad37c0e3"} Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.712685 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.712954 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da14c1bc-4bf5-451d-b547-f4695a1f1099","Type":"ContainerDied","Data":"27e0db0e179d31ce7ea0e79507fa9b8ddbc8dd15b66fff98db116d1b86140fed"} Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.713109 4845 scope.go:117] "RemoveContainer" containerID="1fc49f683570faab3ee6de0a4e2c9f7a1e48f5d5ec7c0c827c712793bfb544b5" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.769128 4845 scope.go:117] "RemoveContainer" containerID="36d898e83e0f0d7e2fb72b22a68f13dc46338bf7f3eccf52dcd8d9142d72c38d" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.833639 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data\") pod \"da14c1bc-4bf5-451d-b547-f4695a1f1099\" (UID: 
\"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.833925 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-combined-ca-bundle\") pod \"da14c1bc-4bf5-451d-b547-f4695a1f1099\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.833998 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da14c1bc-4bf5-451d-b547-f4695a1f1099-etc-machine-id\") pod \"da14c1bc-4bf5-451d-b547-f4695a1f1099\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.834075 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data-custom\") pod \"da14c1bc-4bf5-451d-b547-f4695a1f1099\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.834116 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlb2n\" (UniqueName: \"kubernetes.io/projected/da14c1bc-4bf5-451d-b547-f4695a1f1099-kube-api-access-mlb2n\") pod \"da14c1bc-4bf5-451d-b547-f4695a1f1099\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.834178 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-scripts\") pod \"da14c1bc-4bf5-451d-b547-f4695a1f1099\" (UID: \"da14c1bc-4bf5-451d-b547-f4695a1f1099\") " Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.835496 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/da14c1bc-4bf5-451d-b547-f4695a1f1099-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "da14c1bc-4bf5-451d-b547-f4695a1f1099" (UID: "da14c1bc-4bf5-451d-b547-f4695a1f1099"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.839946 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da14c1bc-4bf5-451d-b547-f4695a1f1099-kube-api-access-mlb2n" (OuterVolumeSpecName: "kube-api-access-mlb2n") pod "da14c1bc-4bf5-451d-b547-f4695a1f1099" (UID: "da14c1bc-4bf5-451d-b547-f4695a1f1099"). InnerVolumeSpecName "kube-api-access-mlb2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.845487 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da14c1bc-4bf5-451d-b547-f4695a1f1099" (UID: "da14c1bc-4bf5-451d-b547-f4695a1f1099"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.845663 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-scripts" (OuterVolumeSpecName: "scripts") pod "da14c1bc-4bf5-451d-b547-f4695a1f1099" (UID: "da14c1bc-4bf5-451d-b547-f4695a1f1099"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.944531 4845 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da14c1bc-4bf5-451d-b547-f4695a1f1099-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.944833 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.944848 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlb2n\" (UniqueName: \"kubernetes.io/projected/da14c1bc-4bf5-451d-b547-f4695a1f1099-kube-api-access-mlb2n\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.944861 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:58 crc kubenswrapper[4845]: I0202 10:53:58.990276 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.111170 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da14c1bc-4bf5-451d-b547-f4695a1f1099" (UID: "da14c1bc-4bf5-451d-b547-f4695a1f1099"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.148596 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-sg-core-conf-yaml\") pod \"e4430b5f-6421-41e2-b338-3b215c57957a\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.148674 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-combined-ca-bundle\") pod \"e4430b5f-6421-41e2-b338-3b215c57957a\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.148747 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-run-httpd\") pod \"e4430b5f-6421-41e2-b338-3b215c57957a\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.148771 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z6ch\" (UniqueName: \"kubernetes.io/projected/e4430b5f-6421-41e2-b338-3b215c57957a-kube-api-access-7z6ch\") pod \"e4430b5f-6421-41e2-b338-3b215c57957a\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.148801 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-log-httpd\") pod \"e4430b5f-6421-41e2-b338-3b215c57957a\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.148916 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-config-data\") pod \"e4430b5f-6421-41e2-b338-3b215c57957a\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.148956 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-scripts\") pod \"e4430b5f-6421-41e2-b338-3b215c57957a\" (UID: \"e4430b5f-6421-41e2-b338-3b215c57957a\") " Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.149695 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.149799 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4430b5f-6421-41e2-b338-3b215c57957a" (UID: "e4430b5f-6421-41e2-b338-3b215c57957a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.149960 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4430b5f-6421-41e2-b338-3b215c57957a" (UID: "e4430b5f-6421-41e2-b338-3b215c57957a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.153560 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-scripts" (OuterVolumeSpecName: "scripts") pod "e4430b5f-6421-41e2-b338-3b215c57957a" (UID: "e4430b5f-6421-41e2-b338-3b215c57957a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.154247 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4430b5f-6421-41e2-b338-3b215c57957a-kube-api-access-7z6ch" (OuterVolumeSpecName: "kube-api-access-7z6ch") pod "e4430b5f-6421-41e2-b338-3b215c57957a" (UID: "e4430b5f-6421-41e2-b338-3b215c57957a"). InnerVolumeSpecName "kube-api-access-7z6ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.188140 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4430b5f-6421-41e2-b338-3b215c57957a" (UID: "e4430b5f-6421-41e2-b338-3b215c57957a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.208687 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data" (OuterVolumeSpecName: "config-data") pod "da14c1bc-4bf5-451d-b547-f4695a1f1099" (UID: "da14c1bc-4bf5-451d-b547-f4695a1f1099"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.254394 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da14c1bc-4bf5-451d-b547-f4695a1f1099-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.254453 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.254468 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.254481 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z6ch\" (UniqueName: \"kubernetes.io/projected/e4430b5f-6421-41e2-b338-3b215c57957a-kube-api-access-7z6ch\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.254493 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4430b5f-6421-41e2-b338-3b215c57957a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.254504 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.296067 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4430b5f-6421-41e2-b338-3b215c57957a" (UID: "e4430b5f-6421-41e2-b338-3b215c57957a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.298053 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-config-data" (OuterVolumeSpecName: "config-data") pod "e4430b5f-6421-41e2-b338-3b215c57957a" (UID: "e4430b5f-6421-41e2-b338-3b215c57957a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.356956 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.357305 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4430b5f-6421-41e2-b338-3b215c57957a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.375787 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.392581 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.412950 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:53:59 crc kubenswrapper[4845]: E0202 10:53:59.413595 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="ceilometer-notification-agent" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.413620 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="ceilometer-notification-agent" Feb 02 10:53:59 crc kubenswrapper[4845]: E0202 
10:53:59.413636 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="sg-core" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.413648 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="sg-core" Feb 02 10:53:59 crc kubenswrapper[4845]: E0202 10:53:59.413663 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="proxy-httpd" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.413670 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="proxy-httpd" Feb 02 10:53:59 crc kubenswrapper[4845]: E0202 10:53:59.413681 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerName="cinder-scheduler" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.413687 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerName="cinder-scheduler" Feb 02 10:53:59 crc kubenswrapper[4845]: E0202 10:53:59.413709 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="ceilometer-central-agent" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.413717 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="ceilometer-central-agent" Feb 02 10:53:59 crc kubenswrapper[4845]: E0202 10:53:59.413739 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerName="probe" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.413748 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerName="probe" Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.414011 4845 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="ceilometer-notification-agent"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.414039 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerName="cinder-scheduler"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.414054 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="sg-core"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.414069 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" containerName="probe"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.414089 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="ceilometer-central-agent"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.414102 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" containerName="proxy-httpd"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.415618 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.419090 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.445321 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.462341 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3483568c-cbaa-4f63-94e5-36d1a9534d31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.462674 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-scripts\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.462768 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwlpt\" (UniqueName: \"kubernetes.io/projected/3483568c-cbaa-4f63-94e5-36d1a9534d31-kube-api-access-jwlpt\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.462964 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.463008 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.463154 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-config-data\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.610866 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-nh2sl"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.713698 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3483568c-cbaa-4f63-94e5-36d1a9534d31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.714264 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3483568c-cbaa-4f63-94e5-36d1a9534d31-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.734897 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-scripts\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.735257 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwlpt\" (UniqueName: \"kubernetes.io/projected/3483568c-cbaa-4f63-94e5-36d1a9534d31-kube-api-access-jwlpt\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.735380 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.735425 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.735509 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-config-data\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.740660 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-config-data\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.742761 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-scripts\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.745778 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.754923 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.763711 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwlpt\" (UniqueName: \"kubernetes.io/projected/3483568c-cbaa-4f63-94e5-36d1a9534d31-kube-api-access-jwlpt\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.768124 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3483568c-cbaa-4f63-94e5-36d1a9534d31-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3483568c-cbaa-4f63-94e5-36d1a9534d31\") " pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.819761 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da14c1bc-4bf5-451d-b547-f4695a1f1099" path="/var/lib/kubelet/pods/da14c1bc-4bf5-451d-b547-f4695a1f1099/volumes"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.821692 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.821970 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c10a41f9-4bda-4d90-81c1-09ed21f00b2b","Type":"ContainerStarted","Data":"07e9c8a05eed62468e720f63d1bfeab6d0ec45c2ff37d7ae78305724817dc630"}
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.821999 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-567746f76f-zjfmt" event={"ID":"30fbb4bb-1391-411d-adda-a41d223aed00","Type":"ContainerStarted","Data":"2ed051f1edd72eac12419e5ea83d9cdd76f867860395bc32f8a07b0d289f477d"}
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.822012 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4430b5f-6421-41e2-b338-3b215c57957a","Type":"ContainerDied","Data":"253fc8db05e07f1a530e09a4e9ff070908466c9701a4e2cca1a5c237104581b4"}
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.822027 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" event={"ID":"2ab561fd-1cd4-43c4-a09d-401ca966b4bb","Type":"ContainerStarted","Data":"250cda2944b48f437289aaa5905e90991c9b8dc1b5cb5593f83b10af7cbf343a"}
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.822040 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-567746f76f-zjfmt"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.822056 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-b9b757468-zfd7s"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.822076 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-658dbb4bcd-qn5fs"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.822105 4845 scope.go:117] "RemoveContainer" containerID="8728f9654023c3d16f466f99002e731e63223d9431b4fa5e2833683a43d715a8"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.829197 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-b475b44dc-fr2qw"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.829232 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-658dbb4bcd-qn5fs"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.829245 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-655bdf96f4-zpj7r"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.829771 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.830272 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.834494 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b9b757468-zfd7s" event={"ID":"08e1823d-46cd-40c5-bea1-162473f9a4ce","Type":"ContainerStarted","Data":"99d07802bcf1c0ba9e3fd5b262b073cf214a20ee31d4f7170adba1e4c7702081"}
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.844276 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b475b44dc-fr2qw" event={"ID":"81ca8604-de7c-4752-8bda-89fccd0c1218","Type":"ContainerStarted","Data":"d41fe68659e16adfad7a630c185917e34c6be9d15f8fc9f1160f015fdca6072a"}
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.846396 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data-custom\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.846640 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-combined-ca-bundle\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.846804 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8jn2\" (UniqueName: \"kubernetes.io/projected/1b30c63b-6407-4f1d-a393-c4f33a758db8-kube-api-access-j8jn2\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.849206 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.865484 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.866200 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6d94ddcf58-g79zr"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.873643 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.890265 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-655bdf96f4-zpj7r"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.908799 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6d94ddcf58-g79zr"]
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.951077 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-combined-ca-bundle\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.953795 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8jn2\" (UniqueName: \"kubernetes.io/projected/1b30c63b-6407-4f1d-a393-c4f33a758db8-kube-api-access-j8jn2\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.953834 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-config-data\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.954105 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7dfp\" (UniqueName: \"kubernetes.io/projected/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-kube-api-access-z7dfp\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.955329 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-config-data-custom\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.955445 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.956062 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data-custom\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.956134 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-combined-ca-bundle\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.958364 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.073019916 podStartE2EDuration="16.958348646s" podCreationTimestamp="2026-02-02 10:53:43 +0000 UTC" firstStartedPulling="2026-02-02 10:53:44.687088129 +0000 UTC m=+1305.778489579" lastFinishedPulling="2026-02-02 10:53:58.572416859 +0000 UTC m=+1319.663818309" observedRunningTime="2026-02-02 10:53:59.894435095 +0000 UTC m=+1320.985836545" watchObservedRunningTime="2026-02-02 10:53:59.958348646 +0000 UTC m=+1321.049750096"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.963748 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.963761 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data-custom\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.964334 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-combined-ca-bundle\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:53:59 crc kubenswrapper[4845]: I0202 10:53:59.978119 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8jn2\" (UniqueName: \"kubernetes.io/projected/1b30c63b-6407-4f1d-a393-c4f33a758db8-kube-api-access-j8jn2\") pod \"heat-api-655bdf96f4-zpj7r\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.023670 4845 scope.go:117] "RemoveContainer" containerID="667445270cb8d9e7bc55a98bf683b24cb0b8dad3cdc9fba27ac254a584303435"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.037737 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.048404 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.060096 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvzds\" (UniqueName: \"kubernetes.io/projected/06e2175d-c446-4586-a3cb-e5819314abfe-kube-api-access-xvzds\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.060218 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data-custom\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.060273 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-combined-ca-bundle\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.060335 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.060435 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-config-data\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.060456 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-combined-ca-bundle\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.060597 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7dfp\" (UniqueName: \"kubernetes.io/projected/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-kube-api-access-z7dfp\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.060671 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-config-data-custom\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.062813 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.066041 4845 scope.go:117] "RemoveContainer" containerID="cf0af5c5e3f564b6556befcc6a6f25d0bf34be4299c282d46a9c4ef4ba6b6020"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.069093 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-combined-ca-bundle\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.069183 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-config-data-custom\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.071915 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-config-data\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.072095 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.076452 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.080494 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.081621 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.084101 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7dfp\" (UniqueName: \"kubernetes.io/projected/6cfd78fb-8f69-43d4-9a58-f7e2f5d27958-kube-api-access-z7dfp\") pod \"heat-engine-658dbb4bcd-qn5fs\" (UID: \"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958\") " pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.120745 4845 scope.go:117] "RemoveContainer" containerID="fbe9c7e888f7c781c1053770c1b8f9cfc807348795f8c409217a85d5272ec120"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165057 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-config-data\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165159 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvzds\" (UniqueName: \"kubernetes.io/projected/06e2175d-c446-4586-a3cb-e5819314abfe-kube-api-access-xvzds\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165256 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data-custom\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165279 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165330 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjdd\" (UniqueName: \"kubernetes.io/projected/53bef472-98d8-47d6-9601-fa9bc8438a5d-kube-api-access-6bjdd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165364 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-run-httpd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165386 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165435 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165457 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-log-httpd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165552 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-combined-ca-bundle\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.165663 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-scripts\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.173553 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data-custom\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.173699 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.179580 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-combined-ca-bundle\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.193574 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-658dbb4bcd-qn5fs"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.196870 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvzds\" (UniqueName: \"kubernetes.io/projected/06e2175d-c446-4586-a3cb-e5819314abfe-kube-api-access-xvzds\") pod \"heat-cfnapi-6d94ddcf58-g79zr\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.217687 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-655bdf96f4-zpj7r"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.240828 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.268219 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-config-data\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.268314 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.268354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjdd\" (UniqueName: \"kubernetes.io/projected/53bef472-98d8-47d6-9601-fa9bc8438a5d-kube-api-access-6bjdd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.268376 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-run-httpd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.268393 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.268425 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-log-httpd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.268533 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-scripts\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.271557 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-log-httpd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.274786 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-run-httpd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.276162 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-scripts\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.276305 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.276871 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-config-data\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.280470 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.293879 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjdd\" (UniqueName: \"kubernetes.io/projected/53bef472-98d8-47d6-9601-fa9bc8438a5d-kube-api-access-6bjdd\") pod \"ceilometer-0\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.409445 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.799498 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.893423 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-567746f76f-zjfmt" event={"ID":"30fbb4bb-1391-411d-adda-a41d223aed00","Type":"ContainerStarted","Data":"307d8b0b23eff351361b7a613060b4a588e27fc82cd046b95b3491796e7afc93"}
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.902899 4845 generic.go:334] "Generic (PLEG): container finished" podID="48aa6807-1e0b-4eab-8255-01c885a24550" containerID="5c7e1e0f5ba6836be4b2cc0a23514474d1930ec71f3bf7b3e6b27bbccac7ee40" exitCode=0
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.902975 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48aa6807-1e0b-4eab-8255-01c885a24550","Type":"ContainerDied","Data":"5c7e1e0f5ba6836be4b2cc0a23514474d1930ec71f3bf7b3e6b27bbccac7ee40"}
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.903013 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"48aa6807-1e0b-4eab-8255-01c885a24550","Type":"ContainerDied","Data":"046426bd44987200c7af4b3e2c8d25a4f244d0fd82b246a7123cbf79584728b3"}
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.903027 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="046426bd44987200c7af4b3e2c8d25a4f244d0fd82b246a7123cbf79584728b3"
Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.920534 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-567746f76f-zjfmt" podStartSLOduration=8.920512271 podStartE2EDuration="8.920512271s" podCreationTimestamp="2026-02-02 10:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC"
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:00.911072279 +0000 UTC m=+1322.002473729" watchObservedRunningTime="2026-02-02 10:54:00.920512271 +0000 UTC m=+1322.011913721" Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.922466 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" event={"ID":"2ab561fd-1cd4-43c4-a09d-401ca966b4bb","Type":"ContainerDied","Data":"47eae6a3cd68dd92b0b808c46cdd7b757872d2c3608874a20759716d81ea4849"} Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.922411 4845 generic.go:334] "Generic (PLEG): container finished" podID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" containerID="47eae6a3cd68dd92b0b808c46cdd7b757872d2c3608874a20759716d81ea4849" exitCode=0 Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.927517 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3483568c-cbaa-4f63-94e5-36d1a9534d31","Type":"ContainerStarted","Data":"33cfaf5e3509b0c0eafd1b0df4a044108c0bac39fdab1e895132c057046c46c5"} Feb 02 10:54:00 crc kubenswrapper[4845]: I0202 10:54:00.960485 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.108187 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-httpd-run\") pod \"48aa6807-1e0b-4eab-8255-01c885a24550\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.110189 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-combined-ca-bundle\") pod \"48aa6807-1e0b-4eab-8255-01c885a24550\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.110707 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-scripts\") pod \"48aa6807-1e0b-4eab-8255-01c885a24550\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.110913 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-config-data\") pod \"48aa6807-1e0b-4eab-8255-01c885a24550\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.111178 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzv2r\" (UniqueName: \"kubernetes.io/projected/48aa6807-1e0b-4eab-8255-01c885a24550-kube-api-access-nzv2r\") pod \"48aa6807-1e0b-4eab-8255-01c885a24550\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.111214 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-public-tls-certs\") pod \"48aa6807-1e0b-4eab-8255-01c885a24550\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.111233 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-logs\") pod \"48aa6807-1e0b-4eab-8255-01c885a24550\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.111414 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"48aa6807-1e0b-4eab-8255-01c885a24550\" (UID: \"48aa6807-1e0b-4eab-8255-01c885a24550\") " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.113821 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "48aa6807-1e0b-4eab-8255-01c885a24550" (UID: "48aa6807-1e0b-4eab-8255-01c885a24550"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.120516 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-logs" (OuterVolumeSpecName: "logs") pod "48aa6807-1e0b-4eab-8255-01c885a24550" (UID: "48aa6807-1e0b-4eab-8255-01c885a24550"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.130936 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-655bdf96f4-zpj7r"] Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.135316 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48aa6807-1e0b-4eab-8255-01c885a24550-kube-api-access-nzv2r" (OuterVolumeSpecName: "kube-api-access-nzv2r") pod "48aa6807-1e0b-4eab-8255-01c885a24550" (UID: "48aa6807-1e0b-4eab-8255-01c885a24550"). InnerVolumeSpecName "kube-api-access-nzv2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.141704 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-scripts" (OuterVolumeSpecName: "scripts") pod "48aa6807-1e0b-4eab-8255-01c885a24550" (UID: "48aa6807-1e0b-4eab-8255-01c885a24550"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.169745 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-658dbb4bcd-qn5fs"] Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.198334 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48aa6807-1e0b-4eab-8255-01c885a24550" (UID: "48aa6807-1e0b-4eab-8255-01c885a24550"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.212228 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7" (OuterVolumeSpecName: "glance") pod "48aa6807-1e0b-4eab-8255-01c885a24550" (UID: "48aa6807-1e0b-4eab-8255-01c885a24550"). InnerVolumeSpecName "pvc-a16f116c-8f63-4ae9-a645-587add90fda7". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.215315 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzv2r\" (UniqueName: \"kubernetes.io/projected/48aa6807-1e0b-4eab-8255-01c885a24550-kube-api-access-nzv2r\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.215342 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.215370 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") on node \"crc\" " Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.215381 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/48aa6807-1e0b-4eab-8255-01c885a24550-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.215393 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.215403 4845 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.321742 4845 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.322510 4845 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a16f116c-8f63-4ae9-a645-587add90fda7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7") on node "crc" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.371451 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.383813 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6d94ddcf58-g79zr"] Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.393284 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-config-data" (OuterVolumeSpecName: "config-data") pod "48aa6807-1e0b-4eab-8255-01c885a24550" (UID: "48aa6807-1e0b-4eab-8255-01c885a24550"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.405129 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "48aa6807-1e0b-4eab-8255-01c885a24550" (UID: "48aa6807-1e0b-4eab-8255-01c885a24550"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.419772 4845 reconciler_common.go:293] "Volume detached for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.419806 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.419816 4845 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48aa6807-1e0b-4eab-8255-01c885a24550-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.738725 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4430b5f-6421-41e2-b338-3b215c57957a" path="/var/lib/kubelet/pods/e4430b5f-6421-41e2-b338-3b215c57957a/volumes" Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.968005 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" event={"ID":"06e2175d-c446-4586-a3cb-e5819314abfe","Type":"ContainerStarted","Data":"5e9e84a85d37d5a663e5d2555b2bd6af1b0b56601697215185d84529a118289f"} Feb 02 10:54:01 crc kubenswrapper[4845]: I0202 10:54:01.971195 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-655bdf96f4-zpj7r" event={"ID":"1b30c63b-6407-4f1d-a393-c4f33a758db8","Type":"ContainerStarted","Data":"1958eba93e5442b126732bbc2709ccb6a5ad1ffec89971c483a8a27cda81d546"} Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.001506 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerStarted","Data":"3ed9e63dbb2ac681fa2a644a55a08157b6bb091be1c082086d50bfa19e192457"} Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.017145 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" event={"ID":"2ab561fd-1cd4-43c4-a09d-401ca966b4bb","Type":"ContainerStarted","Data":"f6562082a819ade2f17a46123a0743170963965766d739159cebc380dae9a85f"} Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.017924 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.026514 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3483568c-cbaa-4f63-94e5-36d1a9534d31","Type":"ContainerStarted","Data":"ff92ba859349704e56aa88a5d47c5542e780357aabff232513a8a1649b5f1a5e"} Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.033298 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.033299 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-658dbb4bcd-qn5fs" event={"ID":"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958","Type":"ContainerStarted","Data":"1ed2f6f188a24c838bbd168de4e2c1b317808190e41810b07cb51f63eaca0ef4"} Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.033448 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-658dbb4bcd-qn5fs" event={"ID":"6cfd78fb-8f69-43d4-9a58-f7e2f5d27958","Type":"ContainerStarted","Data":"56b7d4d125ac2aad8ac2eaecdd09195080a7b3a15aedeb0e8b2bfa938a97c790"} Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.033785 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.059645 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" podStartSLOduration=10.059623671 podStartE2EDuration="10.059623671s" podCreationTimestamp="2026-02-02 10:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:02.041026926 +0000 UTC m=+1323.132428406" watchObservedRunningTime="2026-02-02 10:54:02.059623671 +0000 UTC m=+1323.151025131" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.070136 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-658dbb4bcd-qn5fs" podStartSLOduration=3.070113763 podStartE2EDuration="3.070113763s" podCreationTimestamp="2026-02-02 10:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:02.06410051 +0000 UTC m=+1323.155501960" watchObservedRunningTime="2026-02-02 10:54:02.070113763 +0000 UTC 
m=+1323.161515213" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.136328 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.208720 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.223067 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:54:02 crc kubenswrapper[4845]: E0202 10:54:02.223586 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" containerName="glance-log" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.223598 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" containerName="glance-log" Feb 02 10:54:02 crc kubenswrapper[4845]: E0202 10:54:02.223649 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" containerName="glance-httpd" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.223656 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" containerName="glance-httpd" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.223875 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" containerName="glance-log" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.223991 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" containerName="glance-httpd" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.225357 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.230490 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.230744 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.236646 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.342014 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-config-data\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.342088 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.342164 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-logs\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.342201 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.342250 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.342289 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-scripts\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.342347 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f65zv\" (UniqueName: \"kubernetes.io/projected/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-kube-api-access-f65zv\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.342423 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.444594 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.444685 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-config-data\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.444718 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.444775 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-logs\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.444802 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.444831 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.444861 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-scripts\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.444921 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f65zv\" (UniqueName: \"kubernetes.io/projected/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-kube-api-access-f65zv\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.446363 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-logs\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.446668 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.460356 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.460407 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ed60d0e4f5ad2fb51e67cadb4519054184ad51c31b402d173121c9411d32387/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.461762 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.462603 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-config-data\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.468385 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.468500 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.481856 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f65zv\" (UniqueName: \"kubernetes.io/projected/80eee60d-7cee-4b29-b022-9f5e8e5d6bdb-kube-api-access-f65zv\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.542486 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a16f116c-8f63-4ae9-a645-587add90fda7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a16f116c-8f63-4ae9-a645-587add90fda7\") pod \"glance-default-external-api-0\" (UID: \"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb\") " pod="openstack/glance-default-external-api-0" Feb 02 10:54:02 crc kubenswrapper[4845]: I0202 10:54:02.566587 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.058593 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3483568c-cbaa-4f63-94e5-36d1a9534d31","Type":"ContainerStarted","Data":"e3ffefa6e81103866221c0a24be7a59af18fdf6a36510dd2c2fd0a3099d37b45"} Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.070875 4845 generic.go:334] "Generic (PLEG): container finished" podID="381d0503-4113-48e1-a344-88e990400075" containerID="2ea8dbd44d235dcde26b7387e5ca4d94d64fb20bb5d89d73c2a48c90da6ef1d6" exitCode=0 Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.071943 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c658d9d4-mvn9b" event={"ID":"381d0503-4113-48e1-a344-88e990400075","Type":"ContainerDied","Data":"2ea8dbd44d235dcde26b7387e5ca4d94d64fb20bb5d89d73c2a48c90da6ef1d6"} Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.072759 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-658dbb4bcd-qn5fs" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.115377 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.115354341 podStartE2EDuration="4.115354341s" podCreationTimestamp="2026-02-02 10:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:03.094419658 +0000 UTC m=+1324.185821108" watchObservedRunningTime="2026-02-02 10:54:03.115354341 +0000 UTC m=+1324.206755791" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.138019 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-b475b44dc-fr2qw"] Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.168127 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7998b4fc87-n5g2f"] Feb 
02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.170347 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.173733 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.174436 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.182037 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-b9b757468-zfd7s"] Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.198620 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7998b4fc87-n5g2f"] Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.251351 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6dcccd9c6c-tq64l"] Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.264406 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.265425 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6dcccd9c6c-tq64l"] Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.267062 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.267220 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.276764 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-public-tls-certs\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.278183 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-config-data-custom\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.278328 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-combined-ca-bundle\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.278412 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-config-data\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.278508 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2rb\" (UniqueName: \"kubernetes.io/projected/7dfab927-78ef-4105-a07b-a109690fda89-kube-api-access-dx2rb\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.278703 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-internal-tls-certs\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383012 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-public-tls-certs\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383079 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-config-data-custom\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383121 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-config-data-custom\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383142 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-combined-ca-bundle\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383179 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-config-data\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383209 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-public-tls-certs\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383247 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx2rb\" (UniqueName: \"kubernetes.io/projected/7dfab927-78ef-4105-a07b-a109690fda89-kube-api-access-dx2rb\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383278 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-config-data\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383313 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-internal-tls-certs\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383357 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4jbf\" (UniqueName: \"kubernetes.io/projected/1225250d-8a00-47d3-acea-856fa864dff5-kube-api-access-m4jbf\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383384 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-combined-ca-bundle\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.383406 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-internal-tls-certs\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.390386 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-public-tls-certs\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.390967 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-internal-tls-certs\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.391876 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-config-data\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.391911 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-combined-ca-bundle\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.395945 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dfab927-78ef-4105-a07b-a109690fda89-config-data-custom\") pod \"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.404354 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx2rb\" (UniqueName: \"kubernetes.io/projected/7dfab927-78ef-4105-a07b-a109690fda89-kube-api-access-dx2rb\") pod 
\"heat-api-7998b4fc87-n5g2f\" (UID: \"7dfab927-78ef-4105-a07b-a109690fda89\") " pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.485901 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-config-data-custom\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.485974 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-public-tls-certs\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.486027 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-config-data\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.486086 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-internal-tls-certs\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.486151 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4jbf\" (UniqueName: \"kubernetes.io/projected/1225250d-8a00-47d3-acea-856fa864dff5-kube-api-access-m4jbf\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: 
\"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.486182 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-combined-ca-bundle\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.490982 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-internal-tls-certs\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.491348 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-config-data\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.492863 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-public-tls-certs\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.493569 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-combined-ca-bundle\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " 
pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.497122 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.498652 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1225250d-8a00-47d3-acea-856fa864dff5-config-data-custom\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.528361 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4jbf\" (UniqueName: \"kubernetes.io/projected/1225250d-8a00-47d3-acea-856fa864dff5-kube-api-access-m4jbf\") pod \"heat-cfnapi-6dcccd9c6c-tq64l\" (UID: \"1225250d-8a00-47d3-acea-856fa864dff5\") " pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.569026 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.569256 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" containerName="glance-log" containerID="cri-o://4f20fb73216b64db7b3a8f01b837e3181d5ffdb5d4d8bf409ca31ea2e79b0bcd" gracePeriod=30 Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.569713 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" containerName="glance-httpd" containerID="cri-o://e01c6288e6f46f1f5e6d28589b3c379a1a0749fccb437477b6b9e2e2597123a4" gracePeriod=30 Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.612861 4845 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:03 crc kubenswrapper[4845]: I0202 10:54:03.739641 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48aa6807-1e0b-4eab-8255-01c885a24550" path="/var/lib/kubelet/pods/48aa6807-1e0b-4eab-8255-01c885a24550/volumes" Feb 02 10:54:04 crc kubenswrapper[4845]: I0202 10:54:04.098289 4845 generic.go:334] "Generic (PLEG): container finished" podID="7f69079e-af81-421c-870a-2a08c1b2420e" containerID="4f20fb73216b64db7b3a8f01b837e3181d5ffdb5d4d8bf409ca31ea2e79b0bcd" exitCode=143 Feb 02 10:54:04 crc kubenswrapper[4845]: I0202 10:54:04.098381 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f69079e-af81-421c-870a-2a08c1b2420e","Type":"ContainerDied","Data":"4f20fb73216b64db7b3a8f01b837e3181d5ffdb5d4d8bf409ca31ea2e79b0bcd"} Feb 02 10:54:04 crc kubenswrapper[4845]: I0202 10:54:04.866634 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.398434 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.554821 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc9m4\" (UniqueName: \"kubernetes.io/projected/381d0503-4113-48e1-a344-88e990400075-kube-api-access-sc9m4\") pod \"381d0503-4113-48e1-a344-88e990400075\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.554934 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-config\") pod \"381d0503-4113-48e1-a344-88e990400075\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.554984 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-httpd-config\") pod \"381d0503-4113-48e1-a344-88e990400075\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.555107 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-combined-ca-bundle\") pod \"381d0503-4113-48e1-a344-88e990400075\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.555354 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-ovndb-tls-certs\") pod \"381d0503-4113-48e1-a344-88e990400075\" (UID: \"381d0503-4113-48e1-a344-88e990400075\") " Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.594216 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "381d0503-4113-48e1-a344-88e990400075" (UID: "381d0503-4113-48e1-a344-88e990400075"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.608767 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7998b4fc87-n5g2f"] Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.629639 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381d0503-4113-48e1-a344-88e990400075-kube-api-access-sc9m4" (OuterVolumeSpecName: "kube-api-access-sc9m4") pod "381d0503-4113-48e1-a344-88e990400075" (UID: "381d0503-4113-48e1-a344-88e990400075"). InnerVolumeSpecName "kube-api-access-sc9m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.639095 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.659018 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc9m4\" (UniqueName: \"kubernetes.io/projected/381d0503-4113-48e1-a344-88e990400075-kube-api-access-sc9m4\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.659052 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.790088 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6dcccd9c6c-tq64l"] Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.911386 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:54:05 crc kubenswrapper[4845]: I0202 10:54:05.996325 
4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "381d0503-4113-48e1-a344-88e990400075" (UID: "381d0503-4113-48e1-a344-88e990400075"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.074170 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.107030 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "381d0503-4113-48e1-a344-88e990400075" (UID: "381d0503-4113-48e1-a344-88e990400075"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.127688 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-config" (OuterVolumeSpecName: "config") pod "381d0503-4113-48e1-a344-88e990400075" (UID: "381d0503-4113-48e1-a344-88e990400075"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.176461 4845 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.176501 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/381d0503-4113-48e1-a344-88e990400075-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.191253 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7998b4fc87-n5g2f" event={"ID":"7dfab927-78ef-4105-a07b-a109690fda89","Type":"ContainerStarted","Data":"d579eeff24b4fe2a30800b3904deea4e676f86ceaf721aba5f0d2b2af4914cd5"} Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.203189 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b9b757468-zfd7s" event={"ID":"08e1823d-46cd-40c5-bea1-162473f9a4ce","Type":"ContainerStarted","Data":"9eb0cc21db22e7b0a6e9194e496c67f804362485d557f665096269f5e604e637"} Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.203385 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-b9b757468-zfd7s" podUID="08e1823d-46cd-40c5-bea1-162473f9a4ce" containerName="heat-cfnapi" containerID="cri-o://9eb0cc21db22e7b0a6e9194e496c67f804362485d557f665096269f5e604e637" gracePeriod=60 Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.204146 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.224023 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-b9b757468-zfd7s" podStartSLOduration=8.85305982 podStartE2EDuration="14.224003103s" 
podCreationTimestamp="2026-02-02 10:53:52 +0000 UTC" firstStartedPulling="2026-02-02 10:53:59.582508894 +0000 UTC m=+1320.673910344" lastFinishedPulling="2026-02-02 10:54:04.953452177 +0000 UTC m=+1326.044853627" observedRunningTime="2026-02-02 10:54:06.221921463 +0000 UTC m=+1327.313322933" watchObservedRunningTime="2026-02-02 10:54:06.224003103 +0000 UTC m=+1327.315404553" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.247568 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c658d9d4-mvn9b" event={"ID":"381d0503-4113-48e1-a344-88e990400075","Type":"ContainerDied","Data":"e0403804c19c7d887fabdb04dd94aa27e4ca3026135843fda8f93cf69fb0c8cd"} Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.247643 4845 scope.go:117] "RemoveContainer" containerID="0318987d82017a4372eee65da6e584e622e1a0531f87350923bd3ea8ad37c0e3" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.247832 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c658d9d4-mvn9b" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.254240 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" event={"ID":"1225250d-8a00-47d3-acea-856fa864dff5","Type":"ContainerStarted","Data":"f9eb7854a008987a6893bb48b3ca1c67ab3968d1bd26c03725333564ff086950"} Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.274673 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b475b44dc-fr2qw" event={"ID":"81ca8604-de7c-4752-8bda-89fccd0c1218","Type":"ContainerStarted","Data":"e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda"} Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.274859 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-b475b44dc-fr2qw" podUID="81ca8604-de7c-4752-8bda-89fccd0c1218" containerName="heat-api" 
containerID="cri-o://e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda" gracePeriod=60 Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.274958 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.291971 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" event={"ID":"06e2175d-c446-4586-a3cb-e5819314abfe","Type":"ContainerStarted","Data":"b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59"} Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.295186 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.312703 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-b475b44dc-fr2qw" podStartSLOduration=8.991389053 podStartE2EDuration="14.312678616s" podCreationTimestamp="2026-02-02 10:53:52 +0000 UTC" firstStartedPulling="2026-02-02 10:53:59.659230414 +0000 UTC m=+1320.750631864" lastFinishedPulling="2026-02-02 10:54:04.980519977 +0000 UTC m=+1326.071921427" observedRunningTime="2026-02-02 10:54:06.303916514 +0000 UTC m=+1327.395317964" watchObservedRunningTime="2026-02-02 10:54:06.312678616 +0000 UTC m=+1327.404080066" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.318480 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb","Type":"ContainerStarted","Data":"caa6241387363604f43a7866e079962e35eb6d7bf4fe0b831783629994d1d233"} Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.334492 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerStarted","Data":"65f19a5e53861afad5be0dc22811e405a7089640d2466a967d0205688cbae9c3"} Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.383082 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" podStartSLOduration=3.663436359 podStartE2EDuration="7.383054983s" podCreationTimestamp="2026-02-02 10:53:59 +0000 UTC" firstStartedPulling="2026-02-02 10:54:01.397150326 +0000 UTC m=+1322.488551776" lastFinishedPulling="2026-02-02 10:54:05.11676895 +0000 UTC m=+1326.208170400" observedRunningTime="2026-02-02 10:54:06.33434078 +0000 UTC m=+1327.425742230" watchObservedRunningTime="2026-02-02 10:54:06.383054983 +0000 UTC m=+1327.474456433" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.620052 4845 scope.go:117] "RemoveContainer" containerID="2ea8dbd44d235dcde26b7387e5ca4d94d64fb20bb5d89d73c2a48c90da6ef1d6" Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.620622 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c658d9d4-mvn9b"] Feb 02 10:54:06 crc kubenswrapper[4845]: I0202 10:54:06.635536 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c658d9d4-mvn9b"] Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.365035 4845 generic.go:334] "Generic (PLEG): container finished" podID="7f69079e-af81-421c-870a-2a08c1b2420e" containerID="e01c6288e6f46f1f5e6d28589b3c379a1a0749fccb437477b6b9e2e2597123a4" exitCode=0 Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.365310 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f69079e-af81-421c-870a-2a08c1b2420e","Type":"ContainerDied","Data":"e01c6288e6f46f1f5e6d28589b3c379a1a0749fccb437477b6b9e2e2597123a4"} Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.372342 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" event={"ID":"1225250d-8a00-47d3-acea-856fa864dff5","Type":"ContainerStarted","Data":"58e59e68a691afa56c0ba5b9ffddbdb9d05bb3d4917cb33aef8779ee193c6080"} Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.373388 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.391971 4845 generic.go:334] "Generic (PLEG): container finished" podID="06e2175d-c446-4586-a3cb-e5819314abfe" containerID="b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59" exitCode=1 Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.392055 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" event={"ID":"06e2175d-c446-4586-a3cb-e5819314abfe","Type":"ContainerDied","Data":"b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59"} Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.392826 4845 scope.go:117] "RemoveContainer" containerID="b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.398354 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" podStartSLOduration=4.398340617 podStartE2EDuration="4.398340617s" podCreationTimestamp="2026-02-02 10:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:07.397237515 +0000 UTC m=+1328.488638985" watchObservedRunningTime="2026-02-02 10:54:07.398340617 +0000 UTC m=+1328.489742067" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.406111 4845 generic.go:334] "Generic (PLEG): container finished" podID="1b30c63b-6407-4f1d-a393-c4f33a758db8" containerID="7c3cfa22f88c71170abec828b19bef4d61f6d07bacfd495c9500d6b092c37995" exitCode=1 Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 
10:54:07.406212 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-655bdf96f4-zpj7r" event={"ID":"1b30c63b-6407-4f1d-a393-c4f33a758db8","Type":"ContainerDied","Data":"7c3cfa22f88c71170abec828b19bef4d61f6d07bacfd495c9500d6b092c37995"} Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.407052 4845 scope.go:117] "RemoveContainer" containerID="7c3cfa22f88c71170abec828b19bef4d61f6d07bacfd495c9500d6b092c37995" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.433166 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb","Type":"ContainerStarted","Data":"56081e1e9570ba420135229155c0ec53a579472990c940c108f334aa79c5a2cb"} Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.506174 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerStarted","Data":"1e8465847af87f5185fe9a371926a2dffc326ca101ce721ffdfe10ea1d45b3b5"} Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.508317 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7998b4fc87-n5g2f" event={"ID":"7dfab927-78ef-4105-a07b-a109690fda89","Type":"ContainerStarted","Data":"e1229872b3f4bfa1ee396eba8f47d9c69760c36de7fa47ea153df19a95651718"} Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.510029 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.563378 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7998b4fc87-n5g2f" podStartSLOduration=4.563354848 podStartE2EDuration="4.563354848s" podCreationTimestamp="2026-02-02 10:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:07.545088332 
+0000 UTC m=+1328.636489792" watchObservedRunningTime="2026-02-02 10:54:07.563354848 +0000 UTC m=+1328.654756298" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.767643 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="381d0503-4113-48e1-a344-88e990400075" path="/var/lib/kubelet/pods/381d0503-4113-48e1-a344-88e990400075/volumes" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.820914 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.928722 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.928784 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-httpd-run\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.928991 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-scripts\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.929135 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-config-data\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 
10:54:07.929171 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-combined-ca-bundle\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.929190 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-logs\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.929241 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.929268 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl7xq\" (UniqueName: \"kubernetes.io/projected/7f69079e-af81-421c-870a-2a08c1b2420e-kube-api-access-cl7xq\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.932758 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.933115 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-logs" (OuterVolumeSpecName: "logs") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.938699 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-scripts" (OuterVolumeSpecName: "scripts") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.951026 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f69079e-af81-421c-870a-2a08c1b2420e-kube-api-access-cl7xq" (OuterVolumeSpecName: "kube-api-access-cl7xq") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e"). InnerVolumeSpecName "kube-api-access-cl7xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:07 crc kubenswrapper[4845]: I0202 10:54:07.990127 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995" (OuterVolumeSpecName: "glance") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e"). InnerVolumeSpecName "pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.055606 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl7xq\" (UniqueName: \"kubernetes.io/projected/7f69079e-af81-421c-870a-2a08c1b2420e-kube-api-access-cl7xq\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.055658 4845 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") on node \"crc\" " Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.055674 4845 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.055685 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.055696 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f69079e-af81-421c-870a-2a08c1b2420e-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.059787 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:08 crc kubenswrapper[4845]: E0202 10:54:08.092702 4845 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs podName:7f69079e-af81-421c-870a-2a08c1b2420e nodeName:}" failed. No retries permitted until 2026-02-02 10:54:08.59266454 +0000 UTC m=+1329.684065990 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "internal-tls-certs" (UniqueName: "kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e") : error deleting /var/lib/kubelet/pods/7f69079e-af81-421c-870a-2a08c1b2420e/volume-subpaths: remove /var/lib/kubelet/pods/7f69079e-af81-421c-870a-2a08c1b2420e/volume-subpaths: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.100247 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-config-data" (OuterVolumeSpecName: "config-data") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.134531 4845 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.134734 4845 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995") on node "crc" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.157804 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.158084 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.158179 4845 reconciler_common.go:293] "Volume detached for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.245152 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.337537 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qspp8"] Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.341207 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" podUID="f57453e0-7229-4521-9d4a-769dc8c888fa" containerName="dnsmasq-dns" containerID="cri-o://e5275c3566c176f1596ba660dd53acc1de661d183bfb3ade1e8619590805afd2" gracePeriod=10 Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.591979 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerStarted","Data":"3ebbd0d0a7130e66b17b81004ea1929c3e2d40fef4c6767da3df2300291382c2"} Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.605739 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ca8604_de7c_4752_8bda_89fccd0c1218.slice/crio-conmon-e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ca8604_de7c_4752_8bda_89fccd0c1218.slice/crio-conmon-e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.605783 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ca8604_de7c_4752_8bda_89fccd0c1218.slice/crio-e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ca8604_de7c_4752_8bda_89fccd0c1218.slice/crio-e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.607416 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.607831 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f69079e-af81-421c-870a-2a08c1b2420e","Type":"ContainerDied","Data":"4d7f9cbfa39969e34e788860572765e355857e65be648d766beb851dfaf208a7"} Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.607895 4845 scope.go:117] "RemoveContainer" containerID="e01c6288e6f46f1f5e6d28589b3c379a1a0749fccb437477b6b9e2e2597123a4" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.608007 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.608956 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e2175d_c446_4586_a3cb_e5819314abfe.slice/crio-conmon-b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e2175d_c446_4586_a3cb_e5819314abfe.slice/crio-conmon-b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.609001 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e2175d_c446_4586_a3cb_e5819314abfe.slice/crio-b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e2175d_c446_4586_a3cb_e5819314abfe.slice/crio-b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 
10:54:08.609032 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30c63b_6407_4f1d_a393_c4f33a758db8.slice/crio-conmon-7c3cfa22f88c71170abec828b19bef4d61f6d07bacfd495c9500d6b092c37995.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30c63b_6407_4f1d_a393_c4f33a758db8.slice/crio-conmon-7c3cfa22f88c71170abec828b19bef4d61f6d07bacfd495c9500d6b092c37995.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.611574 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30c63b_6407_4f1d_a393_c4f33a758db8.slice/crio-7c3cfa22f88c71170abec828b19bef4d61f6d07bacfd495c9500d6b092c37995.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30c63b_6407_4f1d_a393_c4f33a758db8.slice/crio-7c3cfa22f88c71170abec828b19bef4d61f6d07bacfd495c9500d6b092c37995.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.620120 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e2175d_c446_4586_a3cb_e5819314abfe.slice/crio-conmon-2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e2175d_c446_4586_a3cb_e5819314abfe.slice/crio-conmon-2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.620380 4845 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30c63b_6407_4f1d_a393_c4f33a758db8.slice/crio-conmon-515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30c63b_6407_4f1d_a393_c4f33a758db8.slice/crio-conmon-515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.620463 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e2175d_c446_4586_a3cb_e5819314abfe.slice/crio-2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06e2175d_c446_4586_a3cb_e5819314abfe.slice/crio-2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: W0202 10:54:08.620654 4845 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30c63b_6407_4f1d_a393_c4f33a758db8.slice/crio-515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30c63b_6407_4f1d_a393_c4f33a758db8.slice/crio-515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8.scope: no such file or directory Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.630516 4845 generic.go:334] "Generic (PLEG): container finished" podID="06105adf-bd97-410f-922f-cb54a637955d" containerID="ddf1463358b0ba98c5bf9d38609a756d1bcc25cada5f4b3714a84897700e8709" exitCode=137 Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.630606 
4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06105adf-bd97-410f-922f-cb54a637955d","Type":"ContainerDied","Data":"ddf1463358b0ba98c5bf9d38609a756d1bcc25cada5f4b3714a84897700e8709"} Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.658073 4845 generic.go:334] "Generic (PLEG): container finished" podID="81ca8604-de7c-4752-8bda-89fccd0c1218" containerID="e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda" exitCode=0 Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.658188 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b475b44dc-fr2qw" event={"ID":"81ca8604-de7c-4752-8bda-89fccd0c1218","Type":"ContainerDied","Data":"e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda"} Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.658910 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-b475b44dc-fr2qw" event={"ID":"81ca8604-de7c-4752-8bda-89fccd0c1218","Type":"ContainerDied","Data":"d41fe68659e16adfad7a630c185917e34c6be9d15f8fc9f1160f015fdca6072a"} Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.659015 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-b475b44dc-fr2qw" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.672782 4845 generic.go:334] "Generic (PLEG): container finished" podID="06e2175d-c446-4586-a3cb-e5819314abfe" containerID="2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be" exitCode=1 Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.672843 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" event={"ID":"06e2175d-c446-4586-a3cb-e5819314abfe","Type":"ContainerDied","Data":"2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be"} Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.673661 4845 scope.go:117] "RemoveContainer" containerID="2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be" Feb 02 10:54:08 crc kubenswrapper[4845]: E0202 10:54:08.674017 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6d94ddcf58-g79zr_openstack(06e2175d-c446-4586-a3cb-e5819314abfe)\"" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.674968 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-combined-ca-bundle\") pod \"81ca8604-de7c-4752-8bda-89fccd0c1218\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.675063 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data\") pod \"81ca8604-de7c-4752-8bda-89fccd0c1218\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 
10:54:08.675341 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c57d\" (UniqueName: \"kubernetes.io/projected/81ca8604-de7c-4752-8bda-89fccd0c1218-kube-api-access-7c57d\") pod \"81ca8604-de7c-4752-8bda-89fccd0c1218\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.675405 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs\") pod \"7f69079e-af81-421c-870a-2a08c1b2420e\" (UID: \"7f69079e-af81-421c-870a-2a08c1b2420e\") " Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.675444 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data-custom\") pod \"81ca8604-de7c-4752-8bda-89fccd0c1218\" (UID: \"81ca8604-de7c-4752-8bda-89fccd0c1218\") " Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.702473 4845 generic.go:334] "Generic (PLEG): container finished" podID="f57453e0-7229-4521-9d4a-769dc8c888fa" containerID="e5275c3566c176f1596ba660dd53acc1de661d183bfb3ade1e8619590805afd2" exitCode=0 Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.702524 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" event={"ID":"f57453e0-7229-4521-9d4a-769dc8c888fa","Type":"ContainerDied","Data":"e5275c3566c176f1596ba660dd53acc1de661d183bfb3ade1e8619590805afd2"} Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.705955 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ca8604-de7c-4752-8bda-89fccd0c1218-kube-api-access-7c57d" (OuterVolumeSpecName: "kube-api-access-7c57d") pod "81ca8604-de7c-4752-8bda-89fccd0c1218" (UID: "81ca8604-de7c-4752-8bda-89fccd0c1218"). 
InnerVolumeSpecName "kube-api-access-7c57d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.710093 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "81ca8604-de7c-4752-8bda-89fccd0c1218" (UID: "81ca8604-de7c-4752-8bda-89fccd0c1218"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.729744 4845 scope.go:117] "RemoveContainer" containerID="4f20fb73216b64db7b3a8f01b837e3181d5ffdb5d4d8bf409ca31ea2e79b0bcd" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.744085 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7f69079e-af81-421c-870a-2a08c1b2420e" (UID: "7f69079e-af81-421c-870a-2a08c1b2420e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.748296 4845 generic.go:334] "Generic (PLEG): container finished" podID="1b30c63b-6407-4f1d-a393-c4f33a758db8" containerID="515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8" exitCode=1 Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.748473 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-655bdf96f4-zpj7r" event={"ID":"1b30c63b-6407-4f1d-a393-c4f33a758db8","Type":"ContainerDied","Data":"515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8"} Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.750187 4845 scope.go:117] "RemoveContainer" containerID="515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8" Feb 02 10:54:08 crc kubenswrapper[4845]: E0202 10:54:08.750476 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-655bdf96f4-zpj7r_openstack(1b30c63b-6407-4f1d-a393-c4f33a758db8)\"" pod="openstack/heat-api-655bdf96f4-zpj7r" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.773974 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80eee60d-7cee-4b29-b022-9f5e8e5d6bdb","Type":"ContainerStarted","Data":"e8d6d33865830ba7c7e5872dbdebead945559e5fed30e6a6855364535c7f2acd"} Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.780840 4845 scope.go:117] "RemoveContainer" containerID="e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.787497 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"81ca8604-de7c-4752-8bda-89fccd0c1218" (UID: "81ca8604-de7c-4752-8bda-89fccd0c1218"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.787526 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c57d\" (UniqueName: \"kubernetes.io/projected/81ca8604-de7c-4752-8bda-89fccd0c1218-kube-api-access-7c57d\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.787545 4845 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f69079e-af81-421c-870a-2a08c1b2420e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.787554 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.819835 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data" (OuterVolumeSpecName: "config-data") pod "81ca8604-de7c-4752-8bda-89fccd0c1218" (UID: "81ca8604-de7c-4752-8bda-89fccd0c1218"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.827373 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.827351895 podStartE2EDuration="6.827351895s" podCreationTimestamp="2026-02-02 10:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:08.807262226 +0000 UTC m=+1329.898663676" watchObservedRunningTime="2026-02-02 10:54:08.827351895 +0000 UTC m=+1329.918753345" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.843341 4845 scope.go:117] "RemoveContainer" containerID="e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda" Feb 02 10:54:08 crc kubenswrapper[4845]: E0202 10:54:08.845014 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda\": container with ID starting with e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda not found: ID does not exist" containerID="e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.845078 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda"} err="failed to get container status \"e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda\": rpc error: code = NotFound desc = could not find container \"e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda\": container with ID starting with e4ef4309ad042f4deb3e676cecfdbf7fbe26cbd2cdfa1e7701941c8c44343dda not found: ID does not exist" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.845111 4845 scope.go:117] "RemoveContainer" 
containerID="b6d861f3d4f1122d7cbdb51df30dffc6c15caaf7e9adea6c717cc462a2d36e59" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.890388 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: I0202 10:54:08.890430 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ca8604-de7c-4752-8bda-89fccd0c1218-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:08 crc kubenswrapper[4845]: E0202 10:54:08.940269 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod381d0503_4113_48e1_a344_88e990400075.slice/crio-conmon-2ea8dbd44d235dcde26b7387e5ca4d94d64fb20bb5d89d73c2a48c90da6ef1d6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod381d0503_4113_48e1_a344_88e990400075.slice/crio-e0403804c19c7d887fabdb04dd94aa27e4ca3026135843fda8f93cf69fb0c8cd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f69079e_af81_421c_870a_2a08c1b2420e.slice/crio-e01c6288e6f46f1f5e6d28589b3c379a1a0749fccb437477b6b9e2e2597123a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06105adf_bd97_410f_922f_cb54a637955d.slice/crio-conmon-ddf1463358b0ba98c5bf9d38609a756d1bcc25cada5f4b3714a84897700e8709.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f69079e_af81_421c_870a_2a08c1b2420e.slice/crio-conmon-4f20fb73216b64db7b3a8f01b837e3181d5ffdb5d4d8bf409ca31ea2e79b0bcd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48aa6807_1e0b_4eab_8255_01c885a24550.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod381d0503_4113_48e1_a344_88e990400075.slice/crio-2ea8dbd44d235dcde26b7387e5ca4d94d64fb20bb5d89d73c2a48c90da6ef1d6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f69079e_af81_421c_870a_2a08c1b2420e.slice/crio-conmon-e01c6288e6f46f1f5e6d28589b3c379a1a0749fccb437477b6b9e2e2597123a4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf57453e0_7229_4521_9d4a_769dc8c888fa.slice/crio-e5275c3566c176f1596ba660dd53acc1de661d183bfb3ade1e8619590805afd2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f69079e_af81_421c_870a_2a08c1b2420e.slice/crio-4f20fb73216b64db7b3a8f01b837e3181d5ffdb5d4d8bf409ca31ea2e79b0bcd.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.060657 4845 scope.go:117] "RemoveContainer" containerID="7c3cfa22f88c71170abec828b19bef4d61f6d07bacfd495c9500d6b092c37995" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.316043 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.325690 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.337689 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.340751 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.341532 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-swift-storage-0\") pod \"f57453e0-7229-4521-9d4a-769dc8c888fa\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.341562 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q98wr\" (UniqueName: \"kubernetes.io/projected/f57453e0-7229-4521-9d4a-769dc8c888fa-kube-api-access-q98wr\") pod \"f57453e0-7229-4521-9d4a-769dc8c888fa\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.341782 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-sb\") pod \"f57453e0-7229-4521-9d4a-769dc8c888fa\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.341897 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-svc\") pod \"f57453e0-7229-4521-9d4a-769dc8c888fa\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.341921 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-config\") pod \"f57453e0-7229-4521-9d4a-769dc8c888fa\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.341994 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-nb\") pod \"f57453e0-7229-4521-9d4a-769dc8c888fa\" (UID: \"f57453e0-7229-4521-9d4a-769dc8c888fa\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.350240 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57453e0-7229-4521-9d4a-769dc8c888fa-kube-api-access-q98wr" (OuterVolumeSpecName: "kube-api-access-q98wr") pod "f57453e0-7229-4521-9d4a-769dc8c888fa" (UID: "f57453e0-7229-4521-9d4a-769dc8c888fa"). InnerVolumeSpecName "kube-api-access-q98wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.450414 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8jww\" (UniqueName: \"kubernetes.io/projected/06105adf-bd97-410f-922f-cb54a637955d-kube-api-access-k8jww\") pod \"06105adf-bd97-410f-922f-cb54a637955d\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.450609 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data\") pod \"06105adf-bd97-410f-922f-cb54a637955d\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.450873 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-combined-ca-bundle\") pod 
\"06105adf-bd97-410f-922f-cb54a637955d\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.450984 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data-custom\") pod \"06105adf-bd97-410f-922f-cb54a637955d\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.451009 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-scripts\") pod \"06105adf-bd97-410f-922f-cb54a637955d\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.451610 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q98wr\" (UniqueName: \"kubernetes.io/projected/f57453e0-7229-4521-9d4a-769dc8c888fa-kube-api-access-q98wr\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.459218 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "06105adf-bd97-410f-922f-cb54a637955d" (UID: "06105adf-bd97-410f-922f-cb54a637955d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.476561 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f57453e0-7229-4521-9d4a-769dc8c888fa" (UID: "f57453e0-7229-4521-9d4a-769dc8c888fa"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.487842 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-b475b44dc-fr2qw"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.488252 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06105adf-bd97-410f-922f-cb54a637955d-kube-api-access-k8jww" (OuterVolumeSpecName: "kube-api-access-k8jww") pod "06105adf-bd97-410f-922f-cb54a637955d" (UID: "06105adf-bd97-410f-922f-cb54a637955d"). InnerVolumeSpecName "kube-api-access-k8jww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.488764 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-scripts" (OuterVolumeSpecName: "scripts") pod "06105adf-bd97-410f-922f-cb54a637955d" (UID: "06105adf-bd97-410f-922f-cb54a637955d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.503968 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06105adf-bd97-410f-922f-cb54a637955d" (UID: "06105adf-bd97-410f-922f-cb54a637955d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.517750 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-config" (OuterVolumeSpecName: "config") pod "f57453e0-7229-4521-9d4a-769dc8c888fa" (UID: "f57453e0-7229-4521-9d4a-769dc8c888fa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.547928 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-b475b44dc-fr2qw"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.549080 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f57453e0-7229-4521-9d4a-769dc8c888fa" (UID: "f57453e0-7229-4521-9d4a-769dc8c888fa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.550379 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f57453e0-7229-4521-9d4a-769dc8c888fa" (UID: "f57453e0-7229-4521-9d4a-769dc8c888fa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.553356 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06105adf-bd97-410f-922f-cb54a637955d-logs\") pod \"06105adf-bd97-410f-922f-cb54a637955d\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.553582 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06105adf-bd97-410f-922f-cb54a637955d-etc-machine-id\") pod \"06105adf-bd97-410f-922f-cb54a637955d\" (UID: \"06105adf-bd97-410f-922f-cb54a637955d\") " Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.558251 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06105adf-bd97-410f-922f-cb54a637955d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "06105adf-bd97-410f-922f-cb54a637955d" (UID: "06105adf-bd97-410f-922f-cb54a637955d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.561852 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.563384 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.563557 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.563666 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.563747 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.562387 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06105adf-bd97-410f-922f-cb54a637955d-logs" (OuterVolumeSpecName: "logs") pod "06105adf-bd97-410f-922f-cb54a637955d" (UID: "06105adf-bd97-410f-922f-cb54a637955d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.563953 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.564039 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.564144 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8jww\" (UniqueName: \"kubernetes.io/projected/06105adf-bd97-410f-922f-cb54a637955d-kube-api-access-k8jww\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.572818 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573266 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381d0503-4113-48e1-a344-88e990400075" containerName="neutron-api" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573282 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="381d0503-4113-48e1-a344-88e990400075" containerName="neutron-api" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573295 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" containerName="glance-log" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573301 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" containerName="glance-log" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573322 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57453e0-7229-4521-9d4a-769dc8c888fa" containerName="init" Feb 
02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573329 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57453e0-7229-4521-9d4a-769dc8c888fa" containerName="init" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573348 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ca8604-de7c-4752-8bda-89fccd0c1218" containerName="heat-api" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573354 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ca8604-de7c-4752-8bda-89fccd0c1218" containerName="heat-api" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573367 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" containerName="glance-httpd" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573372 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" containerName="glance-httpd" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573383 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57453e0-7229-4521-9d4a-769dc8c888fa" containerName="dnsmasq-dns" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573388 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57453e0-7229-4521-9d4a-769dc8c888fa" containerName="dnsmasq-dns" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573395 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06105adf-bd97-410f-922f-cb54a637955d" containerName="cinder-api" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573400 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="06105adf-bd97-410f-922f-cb54a637955d" containerName="cinder-api" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573415 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381d0503-4113-48e1-a344-88e990400075" containerName="neutron-httpd" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573420 4845 
state_mem.go:107] "Deleted CPUSet assignment" podUID="381d0503-4113-48e1-a344-88e990400075" containerName="neutron-httpd" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.573439 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06105adf-bd97-410f-922f-cb54a637955d" containerName="cinder-api-log" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573444 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="06105adf-bd97-410f-922f-cb54a637955d" containerName="cinder-api-log" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573697 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="06105adf-bd97-410f-922f-cb54a637955d" containerName="cinder-api" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573712 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f57453e0-7229-4521-9d4a-769dc8c888fa" containerName="dnsmasq-dns" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573720 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="06105adf-bd97-410f-922f-cb54a637955d" containerName="cinder-api-log" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573729 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="381d0503-4113-48e1-a344-88e990400075" containerName="neutron-api" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573735 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" containerName="glance-httpd" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573742 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ca8604-de7c-4752-8bda-89fccd0c1218" containerName="heat-api" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573757 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="381d0503-4113-48e1-a344-88e990400075" containerName="neutron-httpd" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.573777 4845 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" containerName="glance-log" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.576412 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.578557 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.578762 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.582955 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data" (OuterVolumeSpecName: "config-data") pod "06105adf-bd97-410f-922f-cb54a637955d" (UID: "06105adf-bd97-410f-922f-cb54a637955d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.609785 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.611343 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f57453e0-7229-4521-9d4a-769dc8c888fa" (UID: "f57453e0-7229-4521-9d4a-769dc8c888fa"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.686871 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f57453e0-7229-4521-9d4a-769dc8c888fa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.687225 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06105adf-bd97-410f-922f-cb54a637955d-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.687242 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06105adf-bd97-410f-922f-cb54a637955d-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.687251 4845 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06105adf-bd97-410f-922f-cb54a637955d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.740364 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f69079e-af81-421c-870a-2a08c1b2420e" path="/var/lib/kubelet/pods/7f69079e-af81-421c-870a-2a08c1b2420e/volumes" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.741467 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ca8604-de7c-4752-8bda-89fccd0c1218" path="/var/lib/kubelet/pods/81ca8604-de7c-4752-8bda-89fccd0c1218/volumes" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.788800 4845 scope.go:117] "RemoveContainer" containerID="2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.789370 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=heat-cfnapi pod=heat-cfnapi-6d94ddcf58-g79zr_openstack(06e2175d-c446-4586-a3cb-e5819314abfe)\"" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.792741 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.792834 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.792983 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbdeff72-81f9-4063-8704-d97b21e01b82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.793165 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbdeff72-81f9-4063-8704-d97b21e01b82-logs\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.793279 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.793749 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbwxq\" (UniqueName: \"kubernetes.io/projected/fbdeff72-81f9-4063-8704-d97b21e01b82-kube-api-access-sbwxq\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.793810 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.793840 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.794154 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.794251 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-qspp8" event={"ID":"f57453e0-7229-4521-9d4a-769dc8c888fa","Type":"ContainerDied","Data":"4772888bdcd0116c54162c4f207b2005eb03b8a93b86ecf4467b09a24d45e538"} Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.794332 4845 scope.go:117] "RemoveContainer" containerID="e5275c3566c176f1596ba660dd53acc1de661d183bfb3ade1e8619590805afd2" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.798865 4845 scope.go:117] "RemoveContainer" containerID="515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8" Feb 02 10:54:09 crc kubenswrapper[4845]: E0202 10:54:09.799241 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-655bdf96f4-zpj7r_openstack(1b30c63b-6407-4f1d-a393-c4f33a758db8)\"" pod="openstack/heat-api-655bdf96f4-zpj7r" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.813377 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.813492 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06105adf-bd97-410f-922f-cb54a637955d","Type":"ContainerDied","Data":"10b5879f559c2a482eb1195a6f65f86b7381b0fca31a6c70d7384ff0ddf1d778"} Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.859081 4845 scope.go:117] "RemoveContainer" containerID="de805387ebe60937953bfa8ca82aa39520aa199cff779fd5003f9f946eb62840" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.876818 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qspp8"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.897281 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbwxq\" (UniqueName: \"kubernetes.io/projected/fbdeff72-81f9-4063-8704-d97b21e01b82-kube-api-access-sbwxq\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.897325 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.897348 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.897382 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.897409 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.897470 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbdeff72-81f9-4063-8704-d97b21e01b82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.897525 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbdeff72-81f9-4063-8704-d97b21e01b82-logs\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.897611 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.898213 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fbdeff72-81f9-4063-8704-d97b21e01b82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.898966 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbdeff72-81f9-4063-8704-d97b21e01b82-logs\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.904221 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-qspp8"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.906797 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.907491 4845 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.907528 4845 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2a65ad887e305c92ac1c8235cc9c5fc327f1ea7ce91b9974356e11ee00bc2f81/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.908250 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.921660 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.925411 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbdeff72-81f9-4063-8704-d97b21e01b82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.934977 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.943307 4845 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sbwxq\" (UniqueName: \"kubernetes.io/projected/fbdeff72-81f9-4063-8704-d97b21e01b82-kube-api-access-sbwxq\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.967136 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.968708 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6fba07dd-e1ed-404b-8436-a4d0a8acd995\") pod \"glance-default-internal-api-0\" (UID: \"fbdeff72-81f9-4063-8704-d97b21e01b82\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.979135 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.981087 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.984130 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.985500 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 10:54:09 crc kubenswrapper[4845]: I0202 10:54:09.985716 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.000051 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.106784 4845 scope.go:117] "RemoveContainer" containerID="ddf1463358b0ba98c5bf9d38609a756d1bcc25cada5f4b3714a84897700e8709" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109437 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r7hn\" (UniqueName: \"kubernetes.io/projected/1800fe94-c9b9-4a5a-963a-75d82a4eab94-kube-api-access-8r7hn\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109498 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109627 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1800fe94-c9b9-4a5a-963a-75d82a4eab94-logs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " 
pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109696 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1800fe94-c9b9-4a5a-963a-75d82a4eab94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109715 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-config-data\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109750 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109797 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109899 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-config-data-custom\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.109949 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-scripts\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.159575 4845 scope.go:117] "RemoveContainer" containerID="0b451f378e885ddb27d4b9a42dcd386c2a29d3bf05ddbf776583d1ff3fa31571" Feb 02 10:54:10 crc kubenswrapper[4845]: E0202 10:54:10.192662 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.211864 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.211961 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-config-data-custom\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.212040 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-scripts\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.212145 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8r7hn\" (UniqueName: \"kubernetes.io/projected/1800fe94-c9b9-4a5a-963a-75d82a4eab94-kube-api-access-8r7hn\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.212180 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.212290 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1800fe94-c9b9-4a5a-963a-75d82a4eab94-logs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.212369 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1800fe94-c9b9-4a5a-963a-75d82a4eab94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.212400 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-config-data\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.212445 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " 
pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.213044 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1800fe94-c9b9-4a5a-963a-75d82a4eab94-logs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.213201 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1800fe94-c9b9-4a5a-963a-75d82a4eab94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.218295 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-config-data\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.218645 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-655bdf96f4-zpj7r" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.218712 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-655bdf96f4-zpj7r" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.218950 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.219000 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.219190 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-config-data-custom\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.220918 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-scripts\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.221352 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.236435 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.242183 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.242274 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.249374 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r7hn\" (UniqueName: \"kubernetes.io/projected/1800fe94-c9b9-4a5a-963a-75d82a4eab94-kube-api-access-8r7hn\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " 
pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.249731 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1800fe94-c9b9-4a5a-963a-75d82a4eab94-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1800fe94-c9b9-4a5a-963a-75d82a4eab94\") " pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.461439 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.859746 4845 scope.go:117] "RemoveContainer" containerID="2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be" Feb 02 10:54:10 crc kubenswrapper[4845]: E0202 10:54:10.861257 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6d94ddcf58-g79zr_openstack(06e2175d-c446-4586-a3cb-e5819314abfe)\"" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.861701 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="ceilometer-central-agent" containerID="cri-o://65f19a5e53861afad5be0dc22811e405a7089640d2466a967d0205688cbae9c3" gracePeriod=30 Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.861913 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerStarted","Data":"f97ac7a8db772c50e60d2fbaebaa59d3f1748dd362df2a5683272ada45ea3c75"} Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.862788 4845 scope.go:117] "RemoveContainer" containerID="515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8" Feb 02 
10:54:10 crc kubenswrapper[4845]: E0202 10:54:10.863126 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-655bdf96f4-zpj7r_openstack(1b30c63b-6407-4f1d-a393-c4f33a758db8)\"" pod="openstack/heat-api-655bdf96f4-zpj7r" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.863696 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.864278 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="sg-core" containerID="cri-o://3ebbd0d0a7130e66b17b81004ea1929c3e2d40fef4c6767da3df2300291382c2" gracePeriod=30 Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.864468 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="ceilometer-notification-agent" containerID="cri-o://1e8465847af87f5185fe9a371926a2dffc326ca101ce721ffdfe10ea1d45b3b5" gracePeriod=30 Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.867728 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="proxy-httpd" containerID="cri-o://f97ac7a8db772c50e60d2fbaebaa59d3f1748dd362df2a5683272ada45ea3c75" gracePeriod=30 Feb 02 10:54:10 crc kubenswrapper[4845]: I0202 10:54:10.941639 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.508295479 podStartE2EDuration="11.941616833s" podCreationTimestamp="2026-02-02 10:53:59 +0000 UTC" firstStartedPulling="2026-02-02 10:54:01.43481625 +0000 UTC m=+1322.526217700" 
lastFinishedPulling="2026-02-02 10:54:09.868137604 +0000 UTC m=+1330.959539054" observedRunningTime="2026-02-02 10:54:10.924678076 +0000 UTC m=+1332.016079526" watchObservedRunningTime="2026-02-02 10:54:10.941616833 +0000 UTC m=+1332.033018283" Feb 02 10:54:10 crc kubenswrapper[4845]: W0202 10:54:10.963618 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbdeff72_81f9_4063_8704_d97b21e01b82.slice/crio-f212d7b4211cc3dd8d32e4e27b7705352f55de84f33307cc1330c2a7479ce163 WatchSource:0}: Error finding container f212d7b4211cc3dd8d32e4e27b7705352f55de84f33307cc1330c2a7479ce163: Status 404 returned error can't find the container with id f212d7b4211cc3dd8d32e4e27b7705352f55de84f33307cc1330c2a7479ce163 Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.000136 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:11 crc kubenswrapper[4845]: W0202 10:54:11.208631 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1800fe94_c9b9_4a5a_963a_75d82a4eab94.slice/crio-763a710669e6fdae557c43f1cd05064fb8856e301fa761483c58eb11a44ecf07 WatchSource:0}: Error finding container 763a710669e6fdae557c43f1cd05064fb8856e301fa761483c58eb11a44ecf07: Status 404 returned error can't find the container with id 763a710669e6fdae557c43f1cd05064fb8856e301fa761483c58eb11a44ecf07 Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.223142 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.729687 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06105adf-bd97-410f-922f-cb54a637955d" path="/var/lib/kubelet/pods/06105adf-bd97-410f-922f-cb54a637955d/volumes" Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.730862 4845 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="f57453e0-7229-4521-9d4a-769dc8c888fa" path="/var/lib/kubelet/pods/f57453e0-7229-4521-9d4a-769dc8c888fa/volumes" Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.882281 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1800fe94-c9b9-4a5a-963a-75d82a4eab94","Type":"ContainerStarted","Data":"763a710669e6fdae557c43f1cd05064fb8856e301fa761483c58eb11a44ecf07"} Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.886182 4845 generic.go:334] "Generic (PLEG): container finished" podID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerID="f97ac7a8db772c50e60d2fbaebaa59d3f1748dd362df2a5683272ada45ea3c75" exitCode=0 Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.886214 4845 generic.go:334] "Generic (PLEG): container finished" podID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerID="3ebbd0d0a7130e66b17b81004ea1929c3e2d40fef4c6767da3df2300291382c2" exitCode=2 Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.886223 4845 generic.go:334] "Generic (PLEG): container finished" podID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerID="1e8465847af87f5185fe9a371926a2dffc326ca101ce721ffdfe10ea1d45b3b5" exitCode=0 Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.886255 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerDied","Data":"f97ac7a8db772c50e60d2fbaebaa59d3f1748dd362df2a5683272ada45ea3c75"} Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.886323 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerDied","Data":"3ebbd0d0a7130e66b17b81004ea1929c3e2d40fef4c6767da3df2300291382c2"} Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.886340 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerDied","Data":"1e8465847af87f5185fe9a371926a2dffc326ca101ce721ffdfe10ea1d45b3b5"} Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.888013 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbdeff72-81f9-4063-8704-d97b21e01b82","Type":"ContainerStarted","Data":"95b6a6e8c90dd7982731049877f0620a009fbc1c4cb5abb5fe80f438d47aa8b1"} Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.888051 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbdeff72-81f9-4063-8704-d97b21e01b82","Type":"ContainerStarted","Data":"f212d7b4211cc3dd8d32e4e27b7705352f55de84f33307cc1330c2a7479ce163"} Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.889210 4845 scope.go:117] "RemoveContainer" containerID="2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be" Feb 02 10:54:11 crc kubenswrapper[4845]: I0202 10:54:11.889484 4845 scope.go:117] "RemoveContainer" containerID="515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8" Feb 02 10:54:11 crc kubenswrapper[4845]: E0202 10:54:11.889573 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6d94ddcf58-g79zr_openstack(06e2175d-c446-4586-a3cb-e5819314abfe)\"" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" Feb 02 10:54:11 crc kubenswrapper[4845]: E0202 10:54:11.889916 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-655bdf96f4-zpj7r_openstack(1b30c63b-6407-4f1d-a393-c4f33a758db8)\"" pod="openstack/heat-api-655bdf96f4-zpj7r" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" Feb 02 10:54:12 crc 
kubenswrapper[4845]: I0202 10:54:12.567878 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.568514 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.623792 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.626101 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.904215 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"fbdeff72-81f9-4063-8704-d97b21e01b82","Type":"ContainerStarted","Data":"8c275e40d70032eaf110681fe17246e3822b17ae904b3c83b5277490b47b3543"} Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.907031 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1800fe94-c9b9-4a5a-963a-75d82a4eab94","Type":"ContainerStarted","Data":"481de43b9e5202c6f309950a865e8f57a5f190ad0427bae791b66d535ea14cd0"} Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.907074 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1800fe94-c9b9-4a5a-963a-75d82a4eab94","Type":"ContainerStarted","Data":"14b905db344b97fb6fd4ec19c9968f2d05e85faf5dc8d6ecb7839ad6c0cb2410"} Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.907554 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.907594 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 
10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.927512 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.927480946 podStartE2EDuration="3.927480946s" podCreationTimestamp="2026-02-02 10:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:12.923659756 +0000 UTC m=+1334.015061226" watchObservedRunningTime="2026-02-02 10:54:12.927480946 +0000 UTC m=+1334.018882396" Feb 02 10:54:12 crc kubenswrapper[4845]: I0202 10:54:12.962452 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.962425312 podStartE2EDuration="3.962425312s" podCreationTimestamp="2026-02-02 10:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:12.951650372 +0000 UTC m=+1334.043051832" watchObservedRunningTime="2026-02-02 10:54:12.962425312 +0000 UTC m=+1334.053826762" Feb 02 10:54:13 crc kubenswrapper[4845]: I0202 10:54:13.155005 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:54:13 crc kubenswrapper[4845]: I0202 10:54:13.921837 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 10:54:14 crc kubenswrapper[4845]: I0202 10:54:14.871124 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.258987 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6dcccd9c6c-tq64l" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.372109 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6d94ddcf58-g79zr"] Feb 02 
10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.456858 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.456994 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.739123 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-7998b4fc87-n5g2f" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.833856 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-655bdf96f4-zpj7r"] Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.881500 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.951494 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.972420 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.972689 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6d94ddcf58-g79zr" event={"ID":"06e2175d-c446-4586-a3cb-e5819314abfe","Type":"ContainerDied","Data":"5e9e84a85d37d5a663e5d2555b2bd6af1b0b56601697215185d84529a118289f"} Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.972759 4845 scope.go:117] "RemoveContainer" containerID="2d3403ecc8aab1e7f1cb36779d4b45fbc1dacabb752fd06184d50a7990b505be" Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.977480 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-combined-ca-bundle\") pod \"06e2175d-c446-4586-a3cb-e5819314abfe\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.977649 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data\") pod \"06e2175d-c446-4586-a3cb-e5819314abfe\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.977687 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvzds\" (UniqueName: \"kubernetes.io/projected/06e2175d-c446-4586-a3cb-e5819314abfe-kube-api-access-xvzds\") pod \"06e2175d-c446-4586-a3cb-e5819314abfe\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " Feb 02 10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.977813 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data-custom\") pod \"06e2175d-c446-4586-a3cb-e5819314abfe\" (UID: \"06e2175d-c446-4586-a3cb-e5819314abfe\") " Feb 02 
10:54:15 crc kubenswrapper[4845]: I0202 10:54:15.998356 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "06e2175d-c446-4586-a3cb-e5819314abfe" (UID: "06e2175d-c446-4586-a3cb-e5819314abfe"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.000547 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06e2175d-c446-4586-a3cb-e5819314abfe-kube-api-access-xvzds" (OuterVolumeSpecName: "kube-api-access-xvzds") pod "06e2175d-c446-4586-a3cb-e5819314abfe" (UID: "06e2175d-c446-4586-a3cb-e5819314abfe"). InnerVolumeSpecName "kube-api-access-xvzds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.048248 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06e2175d-c446-4586-a3cb-e5819314abfe" (UID: "06e2175d-c446-4586-a3cb-e5819314abfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.075347 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data" (OuterVolumeSpecName: "config-data") pod "06e2175d-c446-4586-a3cb-e5819314abfe" (UID: "06e2175d-c446-4586-a3cb-e5819314abfe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.084115 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.084161 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvzds\" (UniqueName: \"kubernetes.io/projected/06e2175d-c446-4586-a3cb-e5819314abfe-kube-api-access-xvzds\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.084172 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.084183 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06e2175d-c446-4586-a3cb-e5819314abfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.238418 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.238491 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.238543 4845 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.239742 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6667d6885fd474a5baafce195af3c9008051b075b4b764b236fc396ff08f675c"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.239796 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://6667d6885fd474a5baafce195af3c9008051b075b4b764b236fc396ff08f675c" gracePeriod=600 Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.289896 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-655bdf96f4-zpj7r" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.354980 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6d94ddcf58-g79zr"] Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.377333 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6d94ddcf58-g79zr"] Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.389709 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data-custom\") pod \"1b30c63b-6407-4f1d-a393-c4f33a758db8\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.390382 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-combined-ca-bundle\") pod \"1b30c63b-6407-4f1d-a393-c4f33a758db8\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.390504 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data\") pod \"1b30c63b-6407-4f1d-a393-c4f33a758db8\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.390651 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8jn2\" (UniqueName: \"kubernetes.io/projected/1b30c63b-6407-4f1d-a393-c4f33a758db8-kube-api-access-j8jn2\") pod \"1b30c63b-6407-4f1d-a393-c4f33a758db8\" (UID: \"1b30c63b-6407-4f1d-a393-c4f33a758db8\") " Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.398269 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1b30c63b-6407-4f1d-a393-c4f33a758db8-kube-api-access-j8jn2" (OuterVolumeSpecName: "kube-api-access-j8jn2") pod "1b30c63b-6407-4f1d-a393-c4f33a758db8" (UID: "1b30c63b-6407-4f1d-a393-c4f33a758db8"). InnerVolumeSpecName "kube-api-access-j8jn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.402184 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1b30c63b-6407-4f1d-a393-c4f33a758db8" (UID: "1b30c63b-6407-4f1d-a393-c4f33a758db8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.440772 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b30c63b-6407-4f1d-a393-c4f33a758db8" (UID: "1b30c63b-6407-4f1d-a393-c4f33a758db8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.486949 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data" (OuterVolumeSpecName: "config-data") pod "1b30c63b-6407-4f1d-a393-c4f33a758db8" (UID: "1b30c63b-6407-4f1d-a393-c4f33a758db8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.493696 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.493738 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8jn2\" (UniqueName: \"kubernetes.io/projected/1b30c63b-6407-4f1d-a393-c4f33a758db8-kube-api-access-j8jn2\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.493752 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.493764 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b30c63b-6407-4f1d-a393-c4f33a758db8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.984409 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-655bdf96f4-zpj7r" event={"ID":"1b30c63b-6407-4f1d-a393-c4f33a758db8","Type":"ContainerDied","Data":"1958eba93e5442b126732bbc2709ccb6a5ad1ffec89971c483a8a27cda81d546"} Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.984660 4845 scope.go:117] "RemoveContainer" containerID="515ee1e3ff8694e116cc5e516b9da90abbcfc1e6bc587e3f4003c38c236374a8" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.984448 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-655bdf96f4-zpj7r" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.990609 4845 generic.go:334] "Generic (PLEG): container finished" podID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerID="65f19a5e53861afad5be0dc22811e405a7089640d2466a967d0205688cbae9c3" exitCode=0 Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.990664 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerDied","Data":"65f19a5e53861afad5be0dc22811e405a7089640d2466a967d0205688cbae9c3"} Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.990688 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"53bef472-98d8-47d6-9601-fa9bc8438a5d","Type":"ContainerDied","Data":"3ed9e63dbb2ac681fa2a644a55a08157b6bb091be1c082086d50bfa19e192457"} Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.990699 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ed9e63dbb2ac681fa2a644a55a08157b6bb091be1c082086d50bfa19e192457" Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.996168 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="6667d6885fd474a5baafce195af3c9008051b075b4b764b236fc396ff08f675c" exitCode=0 Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.996250 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"6667d6885fd474a5baafce195af3c9008051b075b4b764b236fc396ff08f675c"} Feb 02 10:54:16 crc kubenswrapper[4845]: I0202 10:54:16.996318 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"} Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.050843 4845 scope.go:117] "RemoveContainer" containerID="b265cb3810e3935261baf8bbd2287ce4faf34ceae4eb09c4d8144e547b3debd5" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.098728 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.109720 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-log-httpd\") pod \"53bef472-98d8-47d6-9601-fa9bc8438a5d\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.109828 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-sg-core-conf-yaml\") pod \"53bef472-98d8-47d6-9601-fa9bc8438a5d\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.110008 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-run-httpd\") pod \"53bef472-98d8-47d6-9601-fa9bc8438a5d\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.110063 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bjdd\" (UniqueName: \"kubernetes.io/projected/53bef472-98d8-47d6-9601-fa9bc8438a5d-kube-api-access-6bjdd\") pod \"53bef472-98d8-47d6-9601-fa9bc8438a5d\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.110131 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-scripts\") pod \"53bef472-98d8-47d6-9601-fa9bc8438a5d\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.110234 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-config-data\") pod \"53bef472-98d8-47d6-9601-fa9bc8438a5d\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.110289 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-combined-ca-bundle\") pod \"53bef472-98d8-47d6-9601-fa9bc8438a5d\" (UID: \"53bef472-98d8-47d6-9601-fa9bc8438a5d\") " Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.110277 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "53bef472-98d8-47d6-9601-fa9bc8438a5d" (UID: "53bef472-98d8-47d6-9601-fa9bc8438a5d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.110558 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "53bef472-98d8-47d6-9601-fa9bc8438a5d" (UID: "53bef472-98d8-47d6-9601-fa9bc8438a5d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.111255 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.111281 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/53bef472-98d8-47d6-9601-fa9bc8438a5d-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.115089 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bef472-98d8-47d6-9601-fa9bc8438a5d-kube-api-access-6bjdd" (OuterVolumeSpecName: "kube-api-access-6bjdd") pod "53bef472-98d8-47d6-9601-fa9bc8438a5d" (UID: "53bef472-98d8-47d6-9601-fa9bc8438a5d"). InnerVolumeSpecName "kube-api-access-6bjdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.118155 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-scripts" (OuterVolumeSpecName: "scripts") pod "53bef472-98d8-47d6-9601-fa9bc8438a5d" (UID: "53bef472-98d8-47d6-9601-fa9bc8438a5d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.119717 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-655bdf96f4-zpj7r"] Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.135013 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-655bdf96f4-zpj7r"] Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.154201 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "53bef472-98d8-47d6-9601-fa9bc8438a5d" (UID: "53bef472-98d8-47d6-9601-fa9bc8438a5d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.215251 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.215276 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bjdd\" (UniqueName: \"kubernetes.io/projected/53bef472-98d8-47d6-9601-fa9bc8438a5d-kube-api-access-6bjdd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.215286 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.253024 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53bef472-98d8-47d6-9601-fa9bc8438a5d" (UID: "53bef472-98d8-47d6-9601-fa9bc8438a5d"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.257801 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-config-data" (OuterVolumeSpecName: "config-data") pod "53bef472-98d8-47d6-9601-fa9bc8438a5d" (UID: "53bef472-98d8-47d6-9601-fa9bc8438a5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.316834 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.316869 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53bef472-98d8-47d6-9601-fa9bc8438a5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.758654 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" path="/var/lib/kubelet/pods/06e2175d-c446-4586-a3cb-e5819314abfe/volumes" Feb 02 10:54:17 crc kubenswrapper[4845]: I0202 10:54:17.759435 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" path="/var/lib/kubelet/pods/1b30c63b-6407-4f1d-a393-c4f33a758db8/volumes" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.021003 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.050151 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.069121 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.088695 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:18 crc kubenswrapper[4845]: E0202 10:54:18.089250 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="proxy-httpd" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089267 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="proxy-httpd" Feb 02 10:54:18 crc kubenswrapper[4845]: E0202 10:54:18.089289 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="sg-core" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089296 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="sg-core" Feb 02 10:54:18 crc kubenswrapper[4845]: E0202 10:54:18.089312 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="ceilometer-notification-agent" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089319 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="ceilometer-notification-agent" Feb 02 10:54:18 crc kubenswrapper[4845]: E0202 10:54:18.089328 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="ceilometer-central-agent" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089334 4845 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="ceilometer-central-agent" Feb 02 10:54:18 crc kubenswrapper[4845]: E0202 10:54:18.089351 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" containerName="heat-cfnapi" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089357 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" containerName="heat-cfnapi" Feb 02 10:54:18 crc kubenswrapper[4845]: E0202 10:54:18.089370 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" containerName="heat-cfnapi" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089377 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" containerName="heat-cfnapi" Feb 02 10:54:18 crc kubenswrapper[4845]: E0202 10:54:18.089388 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" containerName="heat-api" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089393 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" containerName="heat-api" Feb 02 10:54:18 crc kubenswrapper[4845]: E0202 10:54:18.089419 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" containerName="heat-api" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089425 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" containerName="heat-api" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089626 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" containerName="heat-api" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089639 4845 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="sg-core" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089652 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b30c63b-6407-4f1d-a393-c4f33a758db8" containerName="heat-api" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089665 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="ceilometer-notification-agent" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089677 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" containerName="heat-cfnapi" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089688 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="proxy-httpd" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089703 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="06e2175d-c446-4586-a3cb-e5819314abfe" containerName="heat-cfnapi" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.089713 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" containerName="ceilometer-central-agent" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.091737 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.093655 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.095230 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.102254 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.255690 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v49d\" (UniqueName: \"kubernetes.io/projected/baf4fb99-ffd9-4c16-b115-bbcb46c01096-kube-api-access-6v49d\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.255806 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-run-httpd\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.255919 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.255989 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-config-data\") pod \"ceilometer-0\" (UID: 
\"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.256115 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-log-httpd\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.256145 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-scripts\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.256163 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.360868 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-config-data\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.365320 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-log-httpd\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.365385 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-scripts\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.365414 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.365551 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v49d\" (UniqueName: \"kubernetes.io/projected/baf4fb99-ffd9-4c16-b115-bbcb46c01096-kube-api-access-6v49d\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.365598 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-run-httpd\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.365704 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.366334 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-run-httpd\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 
02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.366921 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-log-httpd\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.368462 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-config-data\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.369567 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.369774 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-scripts\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.374856 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.389412 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v49d\" (UniqueName: \"kubernetes.io/projected/baf4fb99-ffd9-4c16-b115-bbcb46c01096-kube-api-access-6v49d\") pod \"ceilometer-0\" 
(UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.425191 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:18 crc kubenswrapper[4845]: I0202 10:54:18.938597 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:19 crc kubenswrapper[4845]: I0202 10:54:19.036708 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerStarted","Data":"2e307c045dfca6f5ca82d2578694038311524ca7900bb0337d06405f23ff2a24"} Feb 02 10:54:19 crc kubenswrapper[4845]: E0202 10:54:19.590299 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:19 crc kubenswrapper[4845]: I0202 10:54:19.733115 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53bef472-98d8-47d6-9601-fa9bc8438a5d" path="/var/lib/kubelet/pods/53bef472-98d8-47d6-9601-fa9bc8438a5d/volumes" Feb 02 10:54:19 crc kubenswrapper[4845]: I0202 10:54:19.968737 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:20 crc kubenswrapper[4845]: I0202 10:54:20.053245 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerStarted","Data":"41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448"} Feb 02 10:54:20 crc kubenswrapper[4845]: I0202 10:54:20.220098 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:20 crc kubenswrapper[4845]: I0202 10:54:20.220421 4845 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:20 crc kubenswrapper[4845]: I0202 10:54:20.249347 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-658dbb4bcd-qn5fs" Feb 02 10:54:20 crc kubenswrapper[4845]: I0202 10:54:20.301544 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:20 crc kubenswrapper[4845]: I0202 10:54:20.303183 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:20 crc kubenswrapper[4845]: I0202 10:54:20.319553 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-567746f76f-zjfmt"] Feb 02 10:54:20 crc kubenswrapper[4845]: I0202 10:54:20.320005 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-567746f76f-zjfmt" podUID="30fbb4bb-1391-411d-adda-a41d223aed00" containerName="heat-engine" containerID="cri-o://307d8b0b23eff351361b7a613060b4a588e27fc82cd046b95b3491796e7afc93" gracePeriod=60 Feb 02 10:54:21 crc kubenswrapper[4845]: I0202 10:54:21.066783 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerStarted","Data":"135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb"} Feb 02 10:54:21 crc kubenswrapper[4845]: I0202 10:54:21.068079 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:21 crc kubenswrapper[4845]: I0202 10:54:21.068204 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:22 crc kubenswrapper[4845]: I0202 10:54:22.084040 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerStarted","Data":"dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef"} Feb 02 10:54:23 crc kubenswrapper[4845]: I0202 10:54:23.114351 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:54:23 crc kubenswrapper[4845]: I0202 10:54:23.115453 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:54:23 crc kubenswrapper[4845]: I0202 10:54:23.116717 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 02 10:54:23 crc kubenswrapper[4845]: E0202 10:54:23.118740 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307d8b0b23eff351361b7a613060b4a588e27fc82cd046b95b3491796e7afc93" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 10:54:23 crc kubenswrapper[4845]: E0202 10:54:23.123271 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307d8b0b23eff351361b7a613060b4a588e27fc82cd046b95b3491796e7afc93" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 10:54:23 crc kubenswrapper[4845]: E0202 10:54:23.125115 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="307d8b0b23eff351361b7a613060b4a588e27fc82cd046b95b3491796e7afc93" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 10:54:23 crc kubenswrapper[4845]: E0202 10:54:23.125169 4845 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" probeType="Readiness" pod="openstack/heat-engine-567746f76f-zjfmt" podUID="30fbb4bb-1391-411d-adda-a41d223aed00" containerName="heat-engine" Feb 02 10:54:24 crc kubenswrapper[4845]: I0202 10:54:24.130805 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerStarted","Data":"9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c"} Feb 02 10:54:24 crc kubenswrapper[4845]: I0202 10:54:24.132341 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:54:24 crc kubenswrapper[4845]: I0202 10:54:24.131707 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="proxy-httpd" containerID="cri-o://9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c" gracePeriod=30 Feb 02 10:54:24 crc kubenswrapper[4845]: I0202 10:54:24.131033 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="ceilometer-central-agent" containerID="cri-o://41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448" gracePeriod=30 Feb 02 10:54:24 crc kubenswrapper[4845]: I0202 10:54:24.131740 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="ceilometer-notification-agent" containerID="cri-o://135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb" gracePeriod=30 Feb 02 10:54:24 crc kubenswrapper[4845]: I0202 10:54:24.131726 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="sg-core" containerID="cri-o://dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef" 
gracePeriod=30 Feb 02 10:54:24 crc kubenswrapper[4845]: I0202 10:54:24.175713 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.601790529 podStartE2EDuration="6.175687912s" podCreationTimestamp="2026-02-02 10:54:18 +0000 UTC" firstStartedPulling="2026-02-02 10:54:18.961453041 +0000 UTC m=+1340.052854491" lastFinishedPulling="2026-02-02 10:54:23.535350424 +0000 UTC m=+1344.626751874" observedRunningTime="2026-02-02 10:54:24.166557959 +0000 UTC m=+1345.257959409" watchObservedRunningTime="2026-02-02 10:54:24.175687912 +0000 UTC m=+1345.267089372" Feb 02 10:54:25 crc kubenswrapper[4845]: I0202 10:54:25.146774 4845 generic.go:334] "Generic (PLEG): container finished" podID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerID="dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef" exitCode=2 Feb 02 10:54:25 crc kubenswrapper[4845]: I0202 10:54:25.147147 4845 generic.go:334] "Generic (PLEG): container finished" podID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerID="135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb" exitCode=0 Feb 02 10:54:25 crc kubenswrapper[4845]: I0202 10:54:25.146842 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerDied","Data":"dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef"} Feb 02 10:54:25 crc kubenswrapper[4845]: I0202 10:54:25.147201 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerDied","Data":"135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb"} Feb 02 10:54:25 crc kubenswrapper[4845]: E0202 10:54:25.432416 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:25 crc kubenswrapper[4845]: I0202 10:54:25.563619 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:25 crc kubenswrapper[4845]: I0202 10:54:25.564047 4845 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:54:25 crc kubenswrapper[4845]: I0202 10:54:25.566445 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.100971 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-p6lbd"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.103368 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.116872 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p6lbd"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.197860 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36b758e3-acc2-451a-b64d-9c53a7e5f98f-operator-scripts\") pod \"nova-api-db-create-p6lbd\" (UID: \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\") " pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.197988 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2kw8\" (UniqueName: \"kubernetes.io/projected/36b758e3-acc2-451a-b64d-9c53a7e5f98f-kube-api-access-q2kw8\") pod \"nova-api-db-create-p6lbd\" (UID: \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\") " pod="openstack/nova-api-db-create-p6lbd" Feb 02 
10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.213411 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8fed-account-create-update-p76r4"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.214913 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.216944 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.224266 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8fed-account-create-update-p76r4"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.301051 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/621aa5b7-f496-48f4-a72d-74e8886f813e-operator-scripts\") pod \"nova-api-8fed-account-create-update-p76r4\" (UID: \"621aa5b7-f496-48f4-a72d-74e8886f813e\") " pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.301138 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36b758e3-acc2-451a-b64d-9c53a7e5f98f-operator-scripts\") pod \"nova-api-db-create-p6lbd\" (UID: \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\") " pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.301255 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2kw8\" (UniqueName: \"kubernetes.io/projected/36b758e3-acc2-451a-b64d-9c53a7e5f98f-kube-api-access-q2kw8\") pod \"nova-api-db-create-p6lbd\" (UID: \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\") " pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.301335 4845 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58s4c\" (UniqueName: \"kubernetes.io/projected/621aa5b7-f496-48f4-a72d-74e8886f813e-kube-api-access-58s4c\") pod \"nova-api-8fed-account-create-update-p76r4\" (UID: \"621aa5b7-f496-48f4-a72d-74e8886f813e\") " pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.301987 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36b758e3-acc2-451a-b64d-9c53a7e5f98f-operator-scripts\") pod \"nova-api-db-create-p6lbd\" (UID: \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\") " pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.308784 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-69t2n"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.310727 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.324452 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-69t2n"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.333692 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2kw8\" (UniqueName: \"kubernetes.io/projected/36b758e3-acc2-451a-b64d-9c53a7e5f98f-kube-api-access-q2kw8\") pod \"nova-api-db-create-p6lbd\" (UID: \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\") " pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.409441 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/621aa5b7-f496-48f4-a72d-74e8886f813e-operator-scripts\") pod \"nova-api-8fed-account-create-update-p76r4\" (UID: \"621aa5b7-f496-48f4-a72d-74e8886f813e\") " pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.409533 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65acb40f-b003-4d37-93c0-4198beba28ed-operator-scripts\") pod \"nova-cell0-db-create-69t2n\" (UID: \"65acb40f-b003-4d37-93c0-4198beba28ed\") " pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.409634 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rsvt\" (UniqueName: \"kubernetes.io/projected/65acb40f-b003-4d37-93c0-4198beba28ed-kube-api-access-5rsvt\") pod \"nova-cell0-db-create-69t2n\" (UID: \"65acb40f-b003-4d37-93c0-4198beba28ed\") " pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.409688 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-58s4c\" (UniqueName: \"kubernetes.io/projected/621aa5b7-f496-48f4-a72d-74e8886f813e-kube-api-access-58s4c\") pod \"nova-api-8fed-account-create-update-p76r4\" (UID: \"621aa5b7-f496-48f4-a72d-74e8886f813e\") " pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.410955 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/621aa5b7-f496-48f4-a72d-74e8886f813e-operator-scripts\") pod \"nova-api-8fed-account-create-update-p76r4\" (UID: \"621aa5b7-f496-48f4-a72d-74e8886f813e\") " pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.431938 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vkhkq"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.433946 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.434586 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58s4c\" (UniqueName: \"kubernetes.io/projected/621aa5b7-f496-48f4-a72d-74e8886f813e-kube-api-access-58s4c\") pod \"nova-api-8fed-account-create-update-p76r4\" (UID: \"621aa5b7-f496-48f4-a72d-74e8886f813e\") " pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.436223 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.449925 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-dfd5-account-create-update-9mdwn"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.451451 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.456645 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.471944 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vkhkq"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.503938 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-dfd5-account-create-update-9mdwn"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.513094 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rsvt\" (UniqueName: \"kubernetes.io/projected/65acb40f-b003-4d37-93c0-4198beba28ed-kube-api-access-5rsvt\") pod \"nova-cell0-db-create-69t2n\" (UID: \"65acb40f-b003-4d37-93c0-4198beba28ed\") " pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.513145 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn6m2\" (UniqueName: \"kubernetes.io/projected/b9ca8c7e-f45d-4014-9599-2ba08495811f-kube-api-access-wn6m2\") pod \"nova-cell1-db-create-vkhkq\" (UID: \"b9ca8c7e-f45d-4014-9599-2ba08495811f\") " pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.513317 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9ca8c7e-f45d-4014-9599-2ba08495811f-operator-scripts\") pod \"nova-cell1-db-create-vkhkq\" (UID: \"b9ca8c7e-f45d-4014-9599-2ba08495811f\") " pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.513439 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/65acb40f-b003-4d37-93c0-4198beba28ed-operator-scripts\") pod \"nova-cell0-db-create-69t2n\" (UID: \"65acb40f-b003-4d37-93c0-4198beba28ed\") " pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.514356 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65acb40f-b003-4d37-93c0-4198beba28ed-operator-scripts\") pod \"nova-cell0-db-create-69t2n\" (UID: \"65acb40f-b003-4d37-93c0-4198beba28ed\") " pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.530679 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.543689 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rsvt\" (UniqueName: \"kubernetes.io/projected/65acb40f-b003-4d37-93c0-4198beba28ed-kube-api-access-5rsvt\") pod \"nova-cell0-db-create-69t2n\" (UID: \"65acb40f-b003-4d37-93c0-4198beba28ed\") " pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.615196 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn6m2\" (UniqueName: \"kubernetes.io/projected/b9ca8c7e-f45d-4014-9599-2ba08495811f-kube-api-access-wn6m2\") pod \"nova-cell1-db-create-vkhkq\" (UID: \"b9ca8c7e-f45d-4014-9599-2ba08495811f\") " pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.615324 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9ca8c7e-f45d-4014-9599-2ba08495811f-operator-scripts\") pod \"nova-cell1-db-create-vkhkq\" (UID: \"b9ca8c7e-f45d-4014-9599-2ba08495811f\") " pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:27 crc 
kubenswrapper[4845]: I0202 10:54:27.615393 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxqp\" (UniqueName: \"kubernetes.io/projected/93e02369-64e6-46f8-a84d-f50396230784-kube-api-access-8nxqp\") pod \"nova-cell0-dfd5-account-create-update-9mdwn\" (UID: \"93e02369-64e6-46f8-a84d-f50396230784\") " pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.615420 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93e02369-64e6-46f8-a84d-f50396230784-operator-scripts\") pod \"nova-cell0-dfd5-account-create-update-9mdwn\" (UID: \"93e02369-64e6-46f8-a84d-f50396230784\") " pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.616596 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9ca8c7e-f45d-4014-9599-2ba08495811f-operator-scripts\") pod \"nova-cell1-db-create-vkhkq\" (UID: \"b9ca8c7e-f45d-4014-9599-2ba08495811f\") " pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.629337 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4ab2-account-create-update-l992n"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.633453 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.639309 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.642267 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn6m2\" (UniqueName: \"kubernetes.io/projected/b9ca8c7e-f45d-4014-9599-2ba08495811f-kube-api-access-wn6m2\") pod \"nova-cell1-db-create-vkhkq\" (UID: \"b9ca8c7e-f45d-4014-9599-2ba08495811f\") " pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.644217 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.645040 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4ab2-account-create-update-l992n"] Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.699057 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.723859 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgmvl\" (UniqueName: \"kubernetes.io/projected/08a7f3c4-2a4a-4d07-91ee-27a63961c272-kube-api-access-cgmvl\") pod \"nova-cell1-4ab2-account-create-update-l992n\" (UID: \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\") " pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.726122 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a7f3c4-2a4a-4d07-91ee-27a63961c272-operator-scripts\") pod \"nova-cell1-4ab2-account-create-update-l992n\" (UID: \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\") " pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.726289 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8nxqp\" (UniqueName: \"kubernetes.io/projected/93e02369-64e6-46f8-a84d-f50396230784-kube-api-access-8nxqp\") pod \"nova-cell0-dfd5-account-create-update-9mdwn\" (UID: \"93e02369-64e6-46f8-a84d-f50396230784\") " pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.726354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93e02369-64e6-46f8-a84d-f50396230784-operator-scripts\") pod \"nova-cell0-dfd5-account-create-update-9mdwn\" (UID: \"93e02369-64e6-46f8-a84d-f50396230784\") " pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.729977 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93e02369-64e6-46f8-a84d-f50396230784-operator-scripts\") pod \"nova-cell0-dfd5-account-create-update-9mdwn\" (UID: \"93e02369-64e6-46f8-a84d-f50396230784\") " pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.769040 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nxqp\" (UniqueName: \"kubernetes.io/projected/93e02369-64e6-46f8-a84d-f50396230784-kube-api-access-8nxqp\") pod \"nova-cell0-dfd5-account-create-update-9mdwn\" (UID: \"93e02369-64e6-46f8-a84d-f50396230784\") " pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.839649 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgmvl\" (UniqueName: \"kubernetes.io/projected/08a7f3c4-2a4a-4d07-91ee-27a63961c272-kube-api-access-cgmvl\") pod \"nova-cell1-4ab2-account-create-update-l992n\" (UID: \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\") " 
pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.839832 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a7f3c4-2a4a-4d07-91ee-27a63961c272-operator-scripts\") pod \"nova-cell1-4ab2-account-create-update-l992n\" (UID: \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\") " pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.847621 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a7f3c4-2a4a-4d07-91ee-27a63961c272-operator-scripts\") pod \"nova-cell1-4ab2-account-create-update-l992n\" (UID: \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\") " pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:27 crc kubenswrapper[4845]: I0202 10:54:27.874007 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgmvl\" (UniqueName: \"kubernetes.io/projected/08a7f3c4-2a4a-4d07-91ee-27a63961c272-kube-api-access-cgmvl\") pod \"nova-cell1-4ab2-account-create-update-l992n\" (UID: \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\") " pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:28 crc kubenswrapper[4845]: I0202 10:54:28.026938 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:28 crc kubenswrapper[4845]: I0202 10:54:28.056170 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:28 crc kubenswrapper[4845]: I0202 10:54:28.295789 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p6lbd"] Feb 02 10:54:28 crc kubenswrapper[4845]: W0202 10:54:28.298273 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36b758e3_acc2_451a_b64d_9c53a7e5f98f.slice/crio-a2dc59b71d4e75e917f735576836e82d191a7123b82395f183042e4e8602897b WatchSource:0}: Error finding container a2dc59b71d4e75e917f735576836e82d191a7123b82395f183042e4e8602897b: Status 404 returned error can't find the container with id a2dc59b71d4e75e917f735576836e82d191a7123b82395f183042e4e8602897b Feb 02 10:54:28 crc kubenswrapper[4845]: I0202 10:54:28.381817 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8fed-account-create-update-p76r4"] Feb 02 10:54:28 crc kubenswrapper[4845]: I0202 10:54:28.737460 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-69t2n"] Feb 02 10:54:28 crc kubenswrapper[4845]: I0202 10:54:28.782791 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vkhkq"] Feb 02 10:54:28 crc kubenswrapper[4845]: I0202 10:54:28.972632 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4ab2-account-create-update-l992n"] Feb 02 10:54:28 crc kubenswrapper[4845]: I0202 10:54:28.998758 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-dfd5-account-create-update-9mdwn"] Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.282722 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" event={"ID":"93e02369-64e6-46f8-a84d-f50396230784","Type":"ContainerStarted","Data":"def3159dca0d08156117550efff6e96fe43dd7fc3991f81ebd27a9fa9cd35f91"} Feb 02 10:54:29 crc 
kubenswrapper[4845]: I0202 10:54:29.286637 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ab2-account-create-update-l992n" event={"ID":"08a7f3c4-2a4a-4d07-91ee-27a63961c272","Type":"ContainerStarted","Data":"1ac2b596634ce8baec011403f3175913f50d5fac461558bf9df2981fd732ceaf"} Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.309306 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vkhkq" event={"ID":"b9ca8c7e-f45d-4014-9599-2ba08495811f","Type":"ContainerStarted","Data":"c959dbd0617786458da203ebfec1f5f7d7cbfed06b8a763a051d64aecc2aaf06"} Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.316642 4845 generic.go:334] "Generic (PLEG): container finished" podID="621aa5b7-f496-48f4-a72d-74e8886f813e" containerID="8360a8bbf698c5c922fa9756941c2a4df903063c7069ac543b4e17f4e1f546d5" exitCode=0 Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.316774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8fed-account-create-update-p76r4" event={"ID":"621aa5b7-f496-48f4-a72d-74e8886f813e","Type":"ContainerDied","Data":"8360a8bbf698c5c922fa9756941c2a4df903063c7069ac543b4e17f4e1f546d5"} Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.316812 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8fed-account-create-update-p76r4" event={"ID":"621aa5b7-f496-48f4-a72d-74e8886f813e","Type":"ContainerStarted","Data":"f76b13f78cfbc517da005a059719e5fa9f1bfa3650b860b4c36933580a57ddff"} Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.322443 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-69t2n" event={"ID":"65acb40f-b003-4d37-93c0-4198beba28ed","Type":"ContainerStarted","Data":"5f33208b080e0c59e43af57ad0c35b7e52589ea9537af21b95c6b74465223db0"} Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.329659 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="36b758e3-acc2-451a-b64d-9c53a7e5f98f" containerID="d9f965fd9f9dbd88dc973b197a2f3c57c9e0f963db241ff90038d5e96106751f" exitCode=0 Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.329721 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p6lbd" event={"ID":"36b758e3-acc2-451a-b64d-9c53a7e5f98f","Type":"ContainerDied","Data":"d9f965fd9f9dbd88dc973b197a2f3c57c9e0f963db241ff90038d5e96106751f"} Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.329747 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p6lbd" event={"ID":"36b758e3-acc2-451a-b64d-9c53a7e5f98f","Type":"ContainerStarted","Data":"a2dc59b71d4e75e917f735576836e82d191a7123b82395f183042e4e8602897b"} Feb 02 10:54:29 crc kubenswrapper[4845]: I0202 10:54:29.400040 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-69t2n" podStartSLOduration=2.400016094 podStartE2EDuration="2.400016094s" podCreationTimestamp="2026-02-02 10:54:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:29.383698914 +0000 UTC m=+1350.475100364" watchObservedRunningTime="2026-02-02 10:54:29.400016094 +0000 UTC m=+1350.491417544" Feb 02 10:54:29 crc kubenswrapper[4845]: E0202 10:54:29.676319 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.373394 4845 generic.go:334] "Generic (PLEG): container finished" podID="93e02369-64e6-46f8-a84d-f50396230784" containerID="74d2907667c2c914fc57daa5fed9146112b93f8bf9f401e92958c5811e8dc6f3" exitCode=0 Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.373721 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" event={"ID":"93e02369-64e6-46f8-a84d-f50396230784","Type":"ContainerDied","Data":"74d2907667c2c914fc57daa5fed9146112b93f8bf9f401e92958c5811e8dc6f3"} Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.378422 4845 generic.go:334] "Generic (PLEG): container finished" podID="08a7f3c4-2a4a-4d07-91ee-27a63961c272" containerID="6019d47869688d15147a8932c0b84dab21fa14445dd8f012a8b145c6bed8a74d" exitCode=0 Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.378490 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ab2-account-create-update-l992n" event={"ID":"08a7f3c4-2a4a-4d07-91ee-27a63961c272","Type":"ContainerDied","Data":"6019d47869688d15147a8932c0b84dab21fa14445dd8f012a8b145c6bed8a74d"} Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.384491 4845 generic.go:334] "Generic (PLEG): container finished" podID="30fbb4bb-1391-411d-adda-a41d223aed00" containerID="307d8b0b23eff351361b7a613060b4a588e27fc82cd046b95b3491796e7afc93" exitCode=0 Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.384572 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-567746f76f-zjfmt" event={"ID":"30fbb4bb-1391-411d-adda-a41d223aed00","Type":"ContainerDied","Data":"307d8b0b23eff351361b7a613060b4a588e27fc82cd046b95b3491796e7afc93"} Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.394756 4845 generic.go:334] "Generic (PLEG): container finished" podID="b9ca8c7e-f45d-4014-9599-2ba08495811f" containerID="b7177510879626b8e93e3bec07d3378fb19c2869cf116cbd055fdaad370ef7da" exitCode=0 Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.395198 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vkhkq" event={"ID":"b9ca8c7e-f45d-4014-9599-2ba08495811f","Type":"ContainerDied","Data":"b7177510879626b8e93e3bec07d3378fb19c2869cf116cbd055fdaad370ef7da"} Feb 02 10:54:30 crc 
kubenswrapper[4845]: I0202 10:54:30.399132 4845 generic.go:334] "Generic (PLEG): container finished" podID="65acb40f-b003-4d37-93c0-4198beba28ed" containerID="8ac663309bf94373ab6aee2bd864d56af41199dceccdc883fa0aff4b2aa502f8" exitCode=0 Feb 02 10:54:30 crc kubenswrapper[4845]: I0202 10:54:30.399591 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-69t2n" event={"ID":"65acb40f-b003-4d37-93c0-4198beba28ed","Type":"ContainerDied","Data":"8ac663309bf94373ab6aee2bd864d56af41199dceccdc883fa0aff4b2aa502f8"} Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.137791 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.147270 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.160767 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.267668 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data\") pod \"30fbb4bb-1391-411d-adda-a41d223aed00\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.267714 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-combined-ca-bundle\") pod \"30fbb4bb-1391-411d-adda-a41d223aed00\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.267745 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf6hs\" (UniqueName: \"kubernetes.io/projected/30fbb4bb-1391-411d-adda-a41d223aed00-kube-api-access-cf6hs\") pod \"30fbb4bb-1391-411d-adda-a41d223aed00\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.268942 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/621aa5b7-f496-48f4-a72d-74e8886f813e-operator-scripts\") pod \"621aa5b7-f496-48f4-a72d-74e8886f813e\" (UID: \"621aa5b7-f496-48f4-a72d-74e8886f813e\") " Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.269100 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58s4c\" (UniqueName: \"kubernetes.io/projected/621aa5b7-f496-48f4-a72d-74e8886f813e-kube-api-access-58s4c\") pod \"621aa5b7-f496-48f4-a72d-74e8886f813e\" (UID: \"621aa5b7-f496-48f4-a72d-74e8886f813e\") " Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.269159 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36b758e3-acc2-451a-b64d-9c53a7e5f98f-operator-scripts\") pod \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\" (UID: \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\") " Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.269247 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data-custom\") pod \"30fbb4bb-1391-411d-adda-a41d223aed00\" (UID: \"30fbb4bb-1391-411d-adda-a41d223aed00\") " Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.269339 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2kw8\" (UniqueName: \"kubernetes.io/projected/36b758e3-acc2-451a-b64d-9c53a7e5f98f-kube-api-access-q2kw8\") pod \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\" (UID: \"36b758e3-acc2-451a-b64d-9c53a7e5f98f\") " Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.269697 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/621aa5b7-f496-48f4-a72d-74e8886f813e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "621aa5b7-f496-48f4-a72d-74e8886f813e" (UID: "621aa5b7-f496-48f4-a72d-74e8886f813e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.270172 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/621aa5b7-f496-48f4-a72d-74e8886f813e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.270318 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36b758e3-acc2-451a-b64d-9c53a7e5f98f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36b758e3-acc2-451a-b64d-9c53a7e5f98f" (UID: "36b758e3-acc2-451a-b64d-9c53a7e5f98f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.279824 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621aa5b7-f496-48f4-a72d-74e8886f813e-kube-api-access-58s4c" (OuterVolumeSpecName: "kube-api-access-58s4c") pod "621aa5b7-f496-48f4-a72d-74e8886f813e" (UID: "621aa5b7-f496-48f4-a72d-74e8886f813e"). InnerVolumeSpecName "kube-api-access-58s4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.280334 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30fbb4bb-1391-411d-adda-a41d223aed00-kube-api-access-cf6hs" (OuterVolumeSpecName: "kube-api-access-cf6hs") pod "30fbb4bb-1391-411d-adda-a41d223aed00" (UID: "30fbb4bb-1391-411d-adda-a41d223aed00"). InnerVolumeSpecName "kube-api-access-cf6hs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.282229 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "30fbb4bb-1391-411d-adda-a41d223aed00" (UID: "30fbb4bb-1391-411d-adda-a41d223aed00"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.300860 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b758e3-acc2-451a-b64d-9c53a7e5f98f-kube-api-access-q2kw8" (OuterVolumeSpecName: "kube-api-access-q2kw8") pod "36b758e3-acc2-451a-b64d-9c53a7e5f98f" (UID: "36b758e3-acc2-451a-b64d-9c53a7e5f98f"). InnerVolumeSpecName "kube-api-access-q2kw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.313343 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30fbb4bb-1391-411d-adda-a41d223aed00" (UID: "30fbb4bb-1391-411d-adda-a41d223aed00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.361352 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data" (OuterVolumeSpecName: "config-data") pod "30fbb4bb-1391-411d-adda-a41d223aed00" (UID: "30fbb4bb-1391-411d-adda-a41d223aed00"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.375730 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.375777 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2kw8\" (UniqueName: \"kubernetes.io/projected/36b758e3-acc2-451a-b64d-9c53a7e5f98f-kube-api-access-q2kw8\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.375795 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.375808 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30fbb4bb-1391-411d-adda-a41d223aed00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.375821 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf6hs\" (UniqueName: \"kubernetes.io/projected/30fbb4bb-1391-411d-adda-a41d223aed00-kube-api-access-cf6hs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.375833 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58s4c\" (UniqueName: \"kubernetes.io/projected/621aa5b7-f496-48f4-a72d-74e8886f813e-kube-api-access-58s4c\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.375846 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36b758e3-acc2-451a-b64d-9c53a7e5f98f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:31 crc 
kubenswrapper[4845]: I0202 10:54:31.451699 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p6lbd" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.451740 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p6lbd" event={"ID":"36b758e3-acc2-451a-b64d-9c53a7e5f98f","Type":"ContainerDied","Data":"a2dc59b71d4e75e917f735576836e82d191a7123b82395f183042e4e8602897b"} Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.451789 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2dc59b71d4e75e917f735576836e82d191a7123b82395f183042e4e8602897b" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.459614 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-567746f76f-zjfmt" event={"ID":"30fbb4bb-1391-411d-adda-a41d223aed00","Type":"ContainerDied","Data":"2ed051f1edd72eac12419e5ea83d9cdd76f867860395bc32f8a07b0d289f477d"} Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.459689 4845 scope.go:117] "RemoveContainer" containerID="307d8b0b23eff351361b7a613060b4a588e27fc82cd046b95b3491796e7afc93" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.459728 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-567746f76f-zjfmt" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.465878 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8fed-account-create-update-p76r4" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.467478 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8fed-account-create-update-p76r4" event={"ID":"621aa5b7-f496-48f4-a72d-74e8886f813e","Type":"ContainerDied","Data":"f76b13f78cfbc517da005a059719e5fa9f1bfa3650b860b4c36933580a57ddff"} Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.467544 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f76b13f78cfbc517da005a059719e5fa9f1bfa3650b860b4c36933580a57ddff" Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.578935 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-567746f76f-zjfmt"] Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.591044 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-567746f76f-zjfmt"] Feb 02 10:54:31 crc kubenswrapper[4845]: I0202 10:54:31.755561 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30fbb4bb-1391-411d-adda-a41d223aed00" path="/var/lib/kubelet/pods/30fbb4bb-1391-411d-adda-a41d223aed00/volumes" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.005197 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.096327 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a7f3c4-2a4a-4d07-91ee-27a63961c272-operator-scripts\") pod \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\" (UID: \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\") " Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.096434 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgmvl\" (UniqueName: \"kubernetes.io/projected/08a7f3c4-2a4a-4d07-91ee-27a63961c272-kube-api-access-cgmvl\") pod \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\" (UID: \"08a7f3c4-2a4a-4d07-91ee-27a63961c272\") " Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.099558 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08a7f3c4-2a4a-4d07-91ee-27a63961c272-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08a7f3c4-2a4a-4d07-91ee-27a63961c272" (UID: "08a7f3c4-2a4a-4d07-91ee-27a63961c272"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.131457 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a7f3c4-2a4a-4d07-91ee-27a63961c272-kube-api-access-cgmvl" (OuterVolumeSpecName: "kube-api-access-cgmvl") pod "08a7f3c4-2a4a-4d07-91ee-27a63961c272" (UID: "08a7f3c4-2a4a-4d07-91ee-27a63961c272"). InnerVolumeSpecName "kube-api-access-cgmvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.188220 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.200696 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08a7f3c4-2a4a-4d07-91ee-27a63961c272-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.200735 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgmvl\" (UniqueName: \"kubernetes.io/projected/08a7f3c4-2a4a-4d07-91ee-27a63961c272-kube-api-access-cgmvl\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.201954 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.210781 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.301728 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nxqp\" (UniqueName: \"kubernetes.io/projected/93e02369-64e6-46f8-a84d-f50396230784-kube-api-access-8nxqp\") pod \"93e02369-64e6-46f8-a84d-f50396230784\" (UID: \"93e02369-64e6-46f8-a84d-f50396230784\") " Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.301823 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93e02369-64e6-46f8-a84d-f50396230784-operator-scripts\") pod \"93e02369-64e6-46f8-a84d-f50396230784\" (UID: \"93e02369-64e6-46f8-a84d-f50396230784\") " Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.301946 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/65acb40f-b003-4d37-93c0-4198beba28ed-operator-scripts\") pod \"65acb40f-b003-4d37-93c0-4198beba28ed\" (UID: \"65acb40f-b003-4d37-93c0-4198beba28ed\") " Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.302041 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn6m2\" (UniqueName: \"kubernetes.io/projected/b9ca8c7e-f45d-4014-9599-2ba08495811f-kube-api-access-wn6m2\") pod \"b9ca8c7e-f45d-4014-9599-2ba08495811f\" (UID: \"b9ca8c7e-f45d-4014-9599-2ba08495811f\") " Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.302133 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9ca8c7e-f45d-4014-9599-2ba08495811f-operator-scripts\") pod \"b9ca8c7e-f45d-4014-9599-2ba08495811f\" (UID: \"b9ca8c7e-f45d-4014-9599-2ba08495811f\") " Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.302247 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rsvt\" (UniqueName: \"kubernetes.io/projected/65acb40f-b003-4d37-93c0-4198beba28ed-kube-api-access-5rsvt\") pod \"65acb40f-b003-4d37-93c0-4198beba28ed\" (UID: \"65acb40f-b003-4d37-93c0-4198beba28ed\") " Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.303638 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65acb40f-b003-4d37-93c0-4198beba28ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "65acb40f-b003-4d37-93c0-4198beba28ed" (UID: "65acb40f-b003-4d37-93c0-4198beba28ed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.306814 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e02369-64e6-46f8-a84d-f50396230784-kube-api-access-8nxqp" (OuterVolumeSpecName: "kube-api-access-8nxqp") pod "93e02369-64e6-46f8-a84d-f50396230784" (UID: "93e02369-64e6-46f8-a84d-f50396230784"). InnerVolumeSpecName "kube-api-access-8nxqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.307231 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93e02369-64e6-46f8-a84d-f50396230784-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93e02369-64e6-46f8-a84d-f50396230784" (UID: "93e02369-64e6-46f8-a84d-f50396230784"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.309569 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9ca8c7e-f45d-4014-9599-2ba08495811f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9ca8c7e-f45d-4014-9599-2ba08495811f" (UID: "b9ca8c7e-f45d-4014-9599-2ba08495811f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.311468 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9ca8c7e-f45d-4014-9599-2ba08495811f-kube-api-access-wn6m2" (OuterVolumeSpecName: "kube-api-access-wn6m2") pod "b9ca8c7e-f45d-4014-9599-2ba08495811f" (UID: "b9ca8c7e-f45d-4014-9599-2ba08495811f"). InnerVolumeSpecName "kube-api-access-wn6m2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.331118 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65acb40f-b003-4d37-93c0-4198beba28ed-kube-api-access-5rsvt" (OuterVolumeSpecName: "kube-api-access-5rsvt") pod "65acb40f-b003-4d37-93c0-4198beba28ed" (UID: "65acb40f-b003-4d37-93c0-4198beba28ed"). InnerVolumeSpecName "kube-api-access-5rsvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.406752 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9ca8c7e-f45d-4014-9599-2ba08495811f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.406787 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rsvt\" (UniqueName: \"kubernetes.io/projected/65acb40f-b003-4d37-93c0-4198beba28ed-kube-api-access-5rsvt\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.406798 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nxqp\" (UniqueName: \"kubernetes.io/projected/93e02369-64e6-46f8-a84d-f50396230784-kube-api-access-8nxqp\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.406807 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93e02369-64e6-46f8-a84d-f50396230784-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.406816 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/65acb40f-b003-4d37-93c0-4198beba28ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.406824 4845 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-wn6m2\" (UniqueName: \"kubernetes.io/projected/b9ca8c7e-f45d-4014-9599-2ba08495811f-kube-api-access-wn6m2\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.478812 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" event={"ID":"93e02369-64e6-46f8-a84d-f50396230784","Type":"ContainerDied","Data":"def3159dca0d08156117550efff6e96fe43dd7fc3991f81ebd27a9fa9cd35f91"} Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.478901 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def3159dca0d08156117550efff6e96fe43dd7fc3991f81ebd27a9fa9cd35f91" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.478837 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-dfd5-account-create-update-9mdwn" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.480359 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4ab2-account-create-update-l992n" event={"ID":"08a7f3c4-2a4a-4d07-91ee-27a63961c272","Type":"ContainerDied","Data":"1ac2b596634ce8baec011403f3175913f50d5fac461558bf9df2981fd732ceaf"} Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.480401 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac2b596634ce8baec011403f3175913f50d5fac461558bf9df2981fd732ceaf" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.480451 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4ab2-account-create-update-l992n" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.495679 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vkhkq" event={"ID":"b9ca8c7e-f45d-4014-9599-2ba08495811f","Type":"ContainerDied","Data":"c959dbd0617786458da203ebfec1f5f7d7cbfed06b8a763a051d64aecc2aaf06"} Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.495723 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c959dbd0617786458da203ebfec1f5f7d7cbfed06b8a763a051d64aecc2aaf06" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.495792 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vkhkq" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.498875 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-69t2n" event={"ID":"65acb40f-b003-4d37-93c0-4198beba28ed","Type":"ContainerDied","Data":"5f33208b080e0c59e43af57ad0c35b7e52589ea9537af21b95c6b74465223db0"} Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.498953 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f33208b080e0c59e43af57ad0c35b7e52589ea9537af21b95c6b74465223db0" Feb 02 10:54:32 crc kubenswrapper[4845]: I0202 10:54:32.499018 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-69t2n" Feb 02 10:54:34 crc kubenswrapper[4845]: I0202 10:54:34.528429 4845 generic.go:334] "Generic (PLEG): container finished" podID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerID="41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448" exitCode=0 Feb 02 10:54:34 crc kubenswrapper[4845]: I0202 10:54:34.528540 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerDied","Data":"41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448"} Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.752172 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qwpzq"] Feb 02 10:54:37 crc kubenswrapper[4845]: E0202 10:54:37.753095 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9ca8c7e-f45d-4014-9599-2ba08495811f" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753115 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9ca8c7e-f45d-4014-9599-2ba08495811f" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: E0202 10:54:37.753141 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b758e3-acc2-451a-b64d-9c53a7e5f98f" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753149 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b758e3-acc2-451a-b64d-9c53a7e5f98f" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: E0202 10:54:37.753163 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621aa5b7-f496-48f4-a72d-74e8886f813e" containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753170 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="621aa5b7-f496-48f4-a72d-74e8886f813e" 
containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: E0202 10:54:37.753184 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e02369-64e6-46f8-a84d-f50396230784" containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753192 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e02369-64e6-46f8-a84d-f50396230784" containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: E0202 10:54:37.753204 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65acb40f-b003-4d37-93c0-4198beba28ed" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753211 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="65acb40f-b003-4d37-93c0-4198beba28ed" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: E0202 10:54:37.753221 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a7f3c4-2a4a-4d07-91ee-27a63961c272" containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753228 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a7f3c4-2a4a-4d07-91ee-27a63961c272" containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: E0202 10:54:37.753253 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30fbb4bb-1391-411d-adda-a41d223aed00" containerName="heat-engine" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753260 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fbb4bb-1391-411d-adda-a41d223aed00" containerName="heat-engine" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753537 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="36b758e3-acc2-451a-b64d-9c53a7e5f98f" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753557 4845 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="08a7f3c4-2a4a-4d07-91ee-27a63961c272" containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753574 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9ca8c7e-f45d-4014-9599-2ba08495811f" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753594 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e02369-64e6-46f8-a84d-f50396230784" containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753603 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="621aa5b7-f496-48f4-a72d-74e8886f813e" containerName="mariadb-account-create-update" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753619 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="65acb40f-b003-4d37-93c0-4198beba28ed" containerName="mariadb-database-create" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.753634 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="30fbb4bb-1391-411d-adda-a41d223aed00" containerName="heat-engine" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.754597 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.760621 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.761074 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8lh6g" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.761370 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.795425 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qwpzq"] Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.842377 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-config-data\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.842523 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkklk\" (UniqueName: \"kubernetes.io/projected/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-kube-api-access-fkklk\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.842601 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " 
pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.842630 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-scripts\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.945236 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-config-data\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.945360 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkklk\" (UniqueName: \"kubernetes.io/projected/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-kube-api-access-fkklk\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.945427 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.945460 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-scripts\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " 
pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.957776 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-scripts\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.957934 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-config-data\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.957960 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:37 crc kubenswrapper[4845]: I0202 10:54:37.966896 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkklk\" (UniqueName: \"kubernetes.io/projected/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-kube-api-access-fkklk\") pod \"nova-cell0-conductor-db-sync-qwpzq\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:38 crc kubenswrapper[4845]: I0202 10:54:38.078553 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:54:38 crc kubenswrapper[4845]: I0202 10:54:38.564805 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qwpzq"] Feb 02 10:54:38 crc kubenswrapper[4845]: I0202 10:54:38.599903 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" event={"ID":"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff","Type":"ContainerStarted","Data":"7649afe14ec5ab0dbef279c4cc985147cdb3a346b125d477468bb7cabfb65001"} Feb 02 10:54:40 crc kubenswrapper[4845]: E0202 10:54:40.071534 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:40 crc kubenswrapper[4845]: E0202 10:54:40.183617 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:48 crc kubenswrapper[4845]: E0202 10:54:48.242369 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:48 crc kubenswrapper[4845]: E0202 10:54:48.244742 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:48 crc kubenswrapper[4845]: I0202 10:54:48.445947 4845 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.244254 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-mjh66"] Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.251299 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.262828 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-mjh66"] Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.285763 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0d03-account-create-update-79bpm"] Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.292455 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.294758 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.308313 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0d03-account-create-update-79bpm"] Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.463746 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f45c3661-e66b-41a2-9a98-db215df0b2cf-operator-scripts\") pod \"aodh-db-create-mjh66\" (UID: \"f45c3661-e66b-41a2-9a98-db215df0b2cf\") " pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.463922 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfgpr\" 
(UniqueName: \"kubernetes.io/projected/7c902530-dc88-4300-9356-1f3938cfef4a-kube-api-access-wfgpr\") pod \"aodh-0d03-account-create-update-79bpm\" (UID: \"7c902530-dc88-4300-9356-1f3938cfef4a\") " pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.464108 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdfk\" (UniqueName: \"kubernetes.io/projected/f45c3661-e66b-41a2-9a98-db215df0b2cf-kube-api-access-2cdfk\") pod \"aodh-db-create-mjh66\" (UID: \"f45c3661-e66b-41a2-9a98-db215df0b2cf\") " pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.464346 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c902530-dc88-4300-9356-1f3938cfef4a-operator-scripts\") pod \"aodh-0d03-account-create-update-79bpm\" (UID: \"7c902530-dc88-4300-9356-1f3938cfef4a\") " pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.566185 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfgpr\" (UniqueName: \"kubernetes.io/projected/7c902530-dc88-4300-9356-1f3938cfef4a-kube-api-access-wfgpr\") pod \"aodh-0d03-account-create-update-79bpm\" (UID: \"7c902530-dc88-4300-9356-1f3938cfef4a\") " pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.566324 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdfk\" (UniqueName: \"kubernetes.io/projected/f45c3661-e66b-41a2-9a98-db215df0b2cf-kube-api-access-2cdfk\") pod \"aodh-db-create-mjh66\" (UID: \"f45c3661-e66b-41a2-9a98-db215df0b2cf\") " pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.566387 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c902530-dc88-4300-9356-1f3938cfef4a-operator-scripts\") pod \"aodh-0d03-account-create-update-79bpm\" (UID: \"7c902530-dc88-4300-9356-1f3938cfef4a\") " pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.566489 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f45c3661-e66b-41a2-9a98-db215df0b2cf-operator-scripts\") pod \"aodh-db-create-mjh66\" (UID: \"f45c3661-e66b-41a2-9a98-db215df0b2cf\") " pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.567296 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f45c3661-e66b-41a2-9a98-db215df0b2cf-operator-scripts\") pod \"aodh-db-create-mjh66\" (UID: \"f45c3661-e66b-41a2-9a98-db215df0b2cf\") " pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.568581 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c902530-dc88-4300-9356-1f3938cfef4a-operator-scripts\") pod \"aodh-0d03-account-create-update-79bpm\" (UID: \"7c902530-dc88-4300-9356-1f3938cfef4a\") " pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.590503 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfgpr\" (UniqueName: \"kubernetes.io/projected/7c902530-dc88-4300-9356-1f3938cfef4a-kube-api-access-wfgpr\") pod \"aodh-0d03-account-create-update-79bpm\" (UID: \"7c902530-dc88-4300-9356-1f3938cfef4a\") " pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.592014 4845 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2cdfk\" (UniqueName: \"kubernetes.io/projected/f45c3661-e66b-41a2-9a98-db215df0b2cf-kube-api-access-2cdfk\") pod \"aodh-db-create-mjh66\" (UID: \"f45c3661-e66b-41a2-9a98-db215df0b2cf\") " pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.638155 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.772564 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" event={"ID":"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff","Type":"ContainerStarted","Data":"4bd5dbe3f7a8b7903c6ee652f72c588239fb784e24ebb2f47f5ca3b9452668fa"} Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.796615 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" podStartSLOduration=2.527051698 podStartE2EDuration="12.796592214s" podCreationTimestamp="2026-02-02 10:54:37 +0000 UTC" firstStartedPulling="2026-02-02 10:54:38.568587739 +0000 UTC m=+1359.659989199" lastFinishedPulling="2026-02-02 10:54:48.838128265 +0000 UTC m=+1369.929529715" observedRunningTime="2026-02-02 10:54:49.787023228 +0000 UTC m=+1370.878424678" watchObservedRunningTime="2026-02-02 10:54:49.796592214 +0000 UTC m=+1370.887993664" Feb 02 10:54:49 crc kubenswrapper[4845]: I0202 10:54:49.873102 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:50 crc kubenswrapper[4845]: E0202 10:54:50.146932 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:50 crc kubenswrapper[4845]: I0202 10:54:50.423042 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0d03-account-create-update-79bpm"] Feb 02 10:54:50 crc kubenswrapper[4845]: I0202 10:54:50.548810 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-mjh66"] Feb 02 10:54:50 crc kubenswrapper[4845]: W0202 10:54:50.551111 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf45c3661_e66b_41a2_9a98_db215df0b2cf.slice/crio-e6713628b33a9b6b74b19eb1bd6de884b5e4d7d9343c91923ecacc0283e77216 WatchSource:0}: Error finding container e6713628b33a9b6b74b19eb1bd6de884b5e4d7d9343c91923ecacc0283e77216: Status 404 returned error can't find the container with id e6713628b33a9b6b74b19eb1bd6de884b5e4d7d9343c91923ecacc0283e77216 Feb 02 10:54:50 crc kubenswrapper[4845]: I0202 10:54:50.786777 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mjh66" event={"ID":"f45c3661-e66b-41a2-9a98-db215df0b2cf","Type":"ContainerStarted","Data":"e6713628b33a9b6b74b19eb1bd6de884b5e4d7d9343c91923ecacc0283e77216"} Feb 02 10:54:50 crc kubenswrapper[4845]: I0202 10:54:50.789123 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0d03-account-create-update-79bpm" event={"ID":"7c902530-dc88-4300-9356-1f3938cfef4a","Type":"ContainerStarted","Data":"71cf5e6659540df9435ad78762669f09fe675a7a8c5a0e409f37393a959ab2c1"} Feb 02 10:54:51 crc kubenswrapper[4845]: I0202 10:54:51.802273 4845 generic.go:334] "Generic 
(PLEG): container finished" podID="f45c3661-e66b-41a2-9a98-db215df0b2cf" containerID="45867aad16ddef2c9f5a1df92c6f8cab6d7f01caa36027be8a34aecad2ed798b" exitCode=0 Feb 02 10:54:51 crc kubenswrapper[4845]: I0202 10:54:51.802394 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mjh66" event={"ID":"f45c3661-e66b-41a2-9a98-db215df0b2cf","Type":"ContainerDied","Data":"45867aad16ddef2c9f5a1df92c6f8cab6d7f01caa36027be8a34aecad2ed798b"} Feb 02 10:54:51 crc kubenswrapper[4845]: I0202 10:54:51.805318 4845 generic.go:334] "Generic (PLEG): container finished" podID="7c902530-dc88-4300-9356-1f3938cfef4a" containerID="7bd7e7f5964f9f073fd42624c4ca749f932b5e04b1db26f0a1d5d0f87ecbdb1f" exitCode=0 Feb 02 10:54:51 crc kubenswrapper[4845]: I0202 10:54:51.805355 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0d03-account-create-update-79bpm" event={"ID":"7c902530-dc88-4300-9356-1f3938cfef4a","Type":"ContainerDied","Data":"7bd7e7f5964f9f073fd42624c4ca749f932b5e04b1db26f0a1d5d0f87ecbdb1f"} Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.451076 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.458157 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.574002 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfgpr\" (UniqueName: \"kubernetes.io/projected/7c902530-dc88-4300-9356-1f3938cfef4a-kube-api-access-wfgpr\") pod \"7c902530-dc88-4300-9356-1f3938cfef4a\" (UID: \"7c902530-dc88-4300-9356-1f3938cfef4a\") " Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.574234 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdfk\" (UniqueName: \"kubernetes.io/projected/f45c3661-e66b-41a2-9a98-db215df0b2cf-kube-api-access-2cdfk\") pod \"f45c3661-e66b-41a2-9a98-db215df0b2cf\" (UID: \"f45c3661-e66b-41a2-9a98-db215df0b2cf\") " Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.574396 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f45c3661-e66b-41a2-9a98-db215df0b2cf-operator-scripts\") pod \"f45c3661-e66b-41a2-9a98-db215df0b2cf\" (UID: \"f45c3661-e66b-41a2-9a98-db215df0b2cf\") " Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.574436 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c902530-dc88-4300-9356-1f3938cfef4a-operator-scripts\") pod \"7c902530-dc88-4300-9356-1f3938cfef4a\" (UID: \"7c902530-dc88-4300-9356-1f3938cfef4a\") " Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.575470 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c902530-dc88-4300-9356-1f3938cfef4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c902530-dc88-4300-9356-1f3938cfef4a" (UID: "7c902530-dc88-4300-9356-1f3938cfef4a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.575525 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f45c3661-e66b-41a2-9a98-db215df0b2cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f45c3661-e66b-41a2-9a98-db215df0b2cf" (UID: "f45c3661-e66b-41a2-9a98-db215df0b2cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.579714 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c902530-dc88-4300-9356-1f3938cfef4a-kube-api-access-wfgpr" (OuterVolumeSpecName: "kube-api-access-wfgpr") pod "7c902530-dc88-4300-9356-1f3938cfef4a" (UID: "7c902530-dc88-4300-9356-1f3938cfef4a"). InnerVolumeSpecName "kube-api-access-wfgpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.586174 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f45c3661-e66b-41a2-9a98-db215df0b2cf-kube-api-access-2cdfk" (OuterVolumeSpecName: "kube-api-access-2cdfk") pod "f45c3661-e66b-41a2-9a98-db215df0b2cf" (UID: "f45c3661-e66b-41a2-9a98-db215df0b2cf"). InnerVolumeSpecName "kube-api-access-2cdfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.677111 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfgpr\" (UniqueName: \"kubernetes.io/projected/7c902530-dc88-4300-9356-1f3938cfef4a-kube-api-access-wfgpr\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.677152 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdfk\" (UniqueName: \"kubernetes.io/projected/f45c3661-e66b-41a2-9a98-db215df0b2cf-kube-api-access-2cdfk\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.677165 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f45c3661-e66b-41a2-9a98-db215df0b2cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.677177 4845 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c902530-dc88-4300-9356-1f3938cfef4a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.835283 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-mjh66" event={"ID":"f45c3661-e66b-41a2-9a98-db215df0b2cf","Type":"ContainerDied","Data":"e6713628b33a9b6b74b19eb1bd6de884b5e4d7d9343c91923ecacc0283e77216"} Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.835338 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6713628b33a9b6b74b19eb1bd6de884b5e4d7d9343c91923ecacc0283e77216" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.835402 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-mjh66" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.840384 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0d03-account-create-update-79bpm" event={"ID":"7c902530-dc88-4300-9356-1f3938cfef4a","Type":"ContainerDied","Data":"71cf5e6659540df9435ad78762669f09fe675a7a8c5a0e409f37393a959ab2c1"} Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.840511 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71cf5e6659540df9435ad78762669f09fe675a7a8c5a0e409f37393a959ab2c1" Feb 02 10:54:53 crc kubenswrapper[4845]: I0202 10:54:53.840482 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0d03-account-create-update-79bpm" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.695341 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.710507 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-sg-core-conf-yaml\") pod \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.710557 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-scripts\") pod \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.710578 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-combined-ca-bundle\") pod \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\" (UID: 
\"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.710640 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v49d\" (UniqueName: \"kubernetes.io/projected/baf4fb99-ffd9-4c16-b115-bbcb46c01096-kube-api-access-6v49d\") pod \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.710908 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-run-httpd\") pod \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.710989 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-log-httpd\") pod \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.711018 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-config-data\") pod \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\" (UID: \"baf4fb99-ffd9-4c16-b115-bbcb46c01096\") " Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.711281 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "baf4fb99-ffd9-4c16-b115-bbcb46c01096" (UID: "baf4fb99-ffd9-4c16-b115-bbcb46c01096"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.711382 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "baf4fb99-ffd9-4c16-b115-bbcb46c01096" (UID: "baf4fb99-ffd9-4c16-b115-bbcb46c01096"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.712245 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.712266 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/baf4fb99-ffd9-4c16-b115-bbcb46c01096-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.718361 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf4fb99-ffd9-4c16-b115-bbcb46c01096-kube-api-access-6v49d" (OuterVolumeSpecName: "kube-api-access-6v49d") pod "baf4fb99-ffd9-4c16-b115-bbcb46c01096" (UID: "baf4fb99-ffd9-4c16-b115-bbcb46c01096"). InnerVolumeSpecName "kube-api-access-6v49d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.736105 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-scripts" (OuterVolumeSpecName: "scripts") pod "baf4fb99-ffd9-4c16-b115-bbcb46c01096" (UID: "baf4fb99-ffd9-4c16-b115-bbcb46c01096"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.814227 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.814315 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v49d\" (UniqueName: \"kubernetes.io/projected/baf4fb99-ffd9-4c16-b115-bbcb46c01096-kube-api-access-6v49d\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.827007 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "baf4fb99-ffd9-4c16-b115-bbcb46c01096" (UID: "baf4fb99-ffd9-4c16-b115-bbcb46c01096"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.832286 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "baf4fb99-ffd9-4c16-b115-bbcb46c01096" (UID: "baf4fb99-ffd9-4c16-b115-bbcb46c01096"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.854052 4845 generic.go:334] "Generic (PLEG): container finished" podID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerID="9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c" exitCode=137 Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.854135 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.854153 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerDied","Data":"9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c"} Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.854563 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"baf4fb99-ffd9-4c16-b115-bbcb46c01096","Type":"ContainerDied","Data":"2e307c045dfca6f5ca82d2578694038311524ca7900bb0337d06405f23ff2a24"} Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.854615 4845 scope.go:117] "RemoveContainer" containerID="9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.875782 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-config-data" (OuterVolumeSpecName: "config-data") pod "baf4fb99-ffd9-4c16-b115-bbcb46c01096" (UID: "baf4fb99-ffd9-4c16-b115-bbcb46c01096"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.916737 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.917024 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.917100 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf4fb99-ffd9-4c16-b115-bbcb46c01096-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.939658 4845 scope.go:117] "RemoveContainer" containerID="dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.967946 4845 scope.go:117] "RemoveContainer" containerID="135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb" Feb 02 10:54:54 crc kubenswrapper[4845]: I0202 10:54:54.996693 4845 scope.go:117] "RemoveContainer" containerID="41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.026039 4845 scope.go:117] "RemoveContainer" containerID="9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.026525 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c\": container with ID starting with 9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c not found: ID does not exist" 
containerID="9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.026614 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c"} err="failed to get container status \"9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c\": rpc error: code = NotFound desc = could not find container \"9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c\": container with ID starting with 9180d33c29a258d5e5f21165e0791183ddb999a02f1388b14f51db73e9a8e68c not found: ID does not exist" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.026639 4845 scope.go:117] "RemoveContainer" containerID="dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.026833 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef\": container with ID starting with dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef not found: ID does not exist" containerID="dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.026859 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef"} err="failed to get container status \"dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef\": rpc error: code = NotFound desc = could not find container \"dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef\": container with ID starting with dd1e730df852af3cdad58ffb1d7d3871da56a6b229c97c97e197ccfa743aa0ef not found: ID does not exist" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.026872 4845 scope.go:117] 
"RemoveContainer" containerID="135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.031989 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb\": container with ID starting with 135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb not found: ID does not exist" containerID="135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.032035 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb"} err="failed to get container status \"135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb\": rpc error: code = NotFound desc = could not find container \"135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb\": container with ID starting with 135668b391056cee1086648142ef74df147dac60de9fce729184fb01bbd638eb not found: ID does not exist" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.032056 4845 scope.go:117] "RemoveContainer" containerID="41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.032480 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448\": container with ID starting with 41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448 not found: ID does not exist" containerID="41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.032517 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448"} err="failed to get container status \"41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448\": rpc error: code = NotFound desc = could not find container \"41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448\": container with ID starting with 41f2374677b587fdd24068f72bbd9d20bd2de324d32a42b0b98f76dd238c1448 not found: ID does not exist" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.207419 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.225418 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.242685 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.243373 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c902530-dc88-4300-9356-1f3938cfef4a" containerName="mariadb-account-create-update" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243403 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c902530-dc88-4300-9356-1f3938cfef4a" containerName="mariadb-account-create-update" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.243435 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="sg-core" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243442 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="sg-core" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.243454 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="ceilometer-notification-agent" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243460 4845 
state_mem.go:107] "Deleted CPUSet assignment" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="ceilometer-notification-agent" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.243469 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f45c3661-e66b-41a2-9a98-db215df0b2cf" containerName="mariadb-database-create" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243475 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="f45c3661-e66b-41a2-9a98-db215df0b2cf" containerName="mariadb-database-create" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.243489 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="proxy-httpd" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243495 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="proxy-httpd" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.243507 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="ceilometer-central-agent" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243513 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="ceilometer-central-agent" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243740 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="proxy-httpd" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243756 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="ceilometer-central-agent" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243772 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c902530-dc88-4300-9356-1f3938cfef4a" containerName="mariadb-account-create-update" Feb 02 10:54:55 
crc kubenswrapper[4845]: I0202 10:54:55.243780 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="ceilometer-notification-agent" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243789 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" containerName="sg-core" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.243807 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="f45c3661-e66b-41a2-9a98-db215df0b2cf" containerName="mariadb-database-create" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.246361 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.249252 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.249396 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.266585 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.326436 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.326475 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-run-httpd\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" 
Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.326494 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-config-data\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.326516 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.326544 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bknng\" (UniqueName: \"kubernetes.io/projected/1c009cae-0016-4d35-9773-1e313feb5c4a-kube-api-access-bknng\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.326690 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-scripts\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.326974 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-log-httpd\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.428479 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-log-httpd\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.428658 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.428690 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-run-httpd\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.428714 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-config-data\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.428745 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.428784 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bknng\" (UniqueName: \"kubernetes.io/projected/1c009cae-0016-4d35-9773-1e313feb5c4a-kube-api-access-bknng\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" 
Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.428837 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-scripts\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.430356 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-log-httpd\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.430425 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-run-httpd\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.435013 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.435202 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-scripts\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.435266 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.435631 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-config-data\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.453784 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bknng\" (UniqueName: \"kubernetes.io/projected/1c009cae-0016-4d35-9773-1e313feb5c4a-kube-api-access-bknng\") pod \"ceilometer-0\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: E0202 10:54:55.456935 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4430b5f_6421_41e2_b338_3b215c57957a.slice\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.587560 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:55 crc kubenswrapper[4845]: I0202 10:54:55.736409 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf4fb99-ffd9-4c16-b115-bbcb46c01096" path="/var/lib/kubelet/pods/baf4fb99-ffd9-4c16-b115-bbcb46c01096/volumes" Feb 02 10:54:56 crc kubenswrapper[4845]: I0202 10:54:56.183705 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:56 crc kubenswrapper[4845]: I0202 10:54:56.900459 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerStarted","Data":"4a2f52662f64ac4adafea886384b8aff903812b748685ed30b174caef644d27d"} Feb 02 10:54:57 crc kubenswrapper[4845]: I0202 10:54:57.912825 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerStarted","Data":"db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0"} Feb 02 10:54:57 crc kubenswrapper[4845]: I0202 10:54:57.913549 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerStarted","Data":"59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017"} Feb 02 10:54:58 crc kubenswrapper[4845]: I0202 10:54:58.933714 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerStarted","Data":"66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358"} Feb 02 10:55:01 crc kubenswrapper[4845]: I0202 10:55:01.973571 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerStarted","Data":"c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699"} Feb 02 10:55:01 crc kubenswrapper[4845]: I0202 
10:55:01.974522 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:55:02 crc kubenswrapper[4845]: I0202 10:55:02.002067 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.372800197 podStartE2EDuration="7.002041463s" podCreationTimestamp="2026-02-02 10:54:55 +0000 UTC" firstStartedPulling="2026-02-02 10:54:56.153524148 +0000 UTC m=+1377.244925598" lastFinishedPulling="2026-02-02 10:55:00.782765414 +0000 UTC m=+1381.874166864" observedRunningTime="2026-02-02 10:55:01.999660744 +0000 UTC m=+1383.091062194" watchObservedRunningTime="2026-02-02 10:55:02.002041463 +0000 UTC m=+1383.093442913" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.028251 4845 generic.go:334] "Generic (PLEG): container finished" podID="08e1823d-46cd-40c5-bea1-162473f9a4ce" containerID="9eb0cc21db22e7b0a6e9194e496c67f804362485d557f665096269f5e604e637" exitCode=137 Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.028453 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b9b757468-zfd7s" event={"ID":"08e1823d-46cd-40c5-bea1-162473f9a4ce","Type":"ContainerDied","Data":"9eb0cc21db22e7b0a6e9194e496c67f804362485d557f665096269f5e604e637"} Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.028808 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b9b757468-zfd7s" event={"ID":"08e1823d-46cd-40c5-bea1-162473f9a4ce","Type":"ContainerDied","Data":"99d07802bcf1c0ba9e3fd5b262b073cf214a20ee31d4f7170adba1e4c7702081"} Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.028834 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99d07802bcf1c0ba9e3fd5b262b073cf214a20ee31d4f7170adba1e4c7702081" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.030628 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.194444 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data\") pod \"08e1823d-46cd-40c5-bea1-162473f9a4ce\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.194854 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjss5\" (UniqueName: \"kubernetes.io/projected/08e1823d-46cd-40c5-bea1-162473f9a4ce-kube-api-access-vjss5\") pod \"08e1823d-46cd-40c5-bea1-162473f9a4ce\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.195002 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-combined-ca-bundle\") pod \"08e1823d-46cd-40c5-bea1-162473f9a4ce\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.195087 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data-custom\") pod \"08e1823d-46cd-40c5-bea1-162473f9a4ce\" (UID: \"08e1823d-46cd-40c5-bea1-162473f9a4ce\") " Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.200110 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "08e1823d-46cd-40c5-bea1-162473f9a4ce" (UID: "08e1823d-46cd-40c5-bea1-162473f9a4ce"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.200962 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08e1823d-46cd-40c5-bea1-162473f9a4ce-kube-api-access-vjss5" (OuterVolumeSpecName: "kube-api-access-vjss5") pod "08e1823d-46cd-40c5-bea1-162473f9a4ce" (UID: "08e1823d-46cd-40c5-bea1-162473f9a4ce"). InnerVolumeSpecName "kube-api-access-vjss5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.225715 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08e1823d-46cd-40c5-bea1-162473f9a4ce" (UID: "08e1823d-46cd-40c5-bea1-162473f9a4ce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.255645 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data" (OuterVolumeSpecName: "config-data") pod "08e1823d-46cd-40c5-bea1-162473f9a4ce" (UID: "08e1823d-46cd-40c5-bea1-162473f9a4ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.298809 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjss5\" (UniqueName: \"kubernetes.io/projected/08e1823d-46cd-40c5-bea1-162473f9a4ce-kube-api-access-vjss5\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.298851 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.298864 4845 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:07 crc kubenswrapper[4845]: I0202 10:55:07.299162 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08e1823d-46cd-40c5-bea1-162473f9a4ce-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:08 crc kubenswrapper[4845]: I0202 10:55:08.040483 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-b9b757468-zfd7s" Feb 02 10:55:08 crc kubenswrapper[4845]: I0202 10:55:08.070110 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-b9b757468-zfd7s"] Feb 02 10:55:08 crc kubenswrapper[4845]: I0202 10:55:08.088542 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-b9b757468-zfd7s"] Feb 02 10:55:09 crc kubenswrapper[4845]: I0202 10:55:09.726840 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08e1823d-46cd-40c5-bea1-162473f9a4ce" path="/var/lib/kubelet/pods/08e1823d-46cd-40c5-bea1-162473f9a4ce/volumes" Feb 02 10:55:14 crc kubenswrapper[4845]: I0202 10:55:14.112117 4845 generic.go:334] "Generic (PLEG): container finished" podID="cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" containerID="4bd5dbe3f7a8b7903c6ee652f72c588239fb784e24ebb2f47f5ca3b9452668fa" exitCode=0 Feb 02 10:55:14 crc kubenswrapper[4845]: I0202 10:55:14.112211 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" event={"ID":"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff","Type":"ContainerDied","Data":"4bd5dbe3f7a8b7903c6ee652f72c588239fb784e24ebb2f47f5ca3b9452668fa"} Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.519916 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.610494 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-combined-ca-bundle\") pod \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.610663 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkklk\" (UniqueName: \"kubernetes.io/projected/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-kube-api-access-fkklk\") pod \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.610696 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-config-data\") pod \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.610727 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-scripts\") pod \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\" (UID: \"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff\") " Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.615924 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-scripts" (OuterVolumeSpecName: "scripts") pod "cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" (UID: "cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.616965 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-kube-api-access-fkklk" (OuterVolumeSpecName: "kube-api-access-fkklk") pod "cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" (UID: "cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff"). InnerVolumeSpecName "kube-api-access-fkklk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.642830 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" (UID: "cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.646080 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-config-data" (OuterVolumeSpecName: "config-data") pod "cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" (UID: "cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.713046 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkklk\" (UniqueName: \"kubernetes.io/projected/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-kube-api-access-fkklk\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.713275 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.713384 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:15 crc kubenswrapper[4845]: I0202 10:55:15.713455 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.134689 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" event={"ID":"cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff","Type":"ContainerDied","Data":"7649afe14ec5ab0dbef279c4cc985147cdb3a346b125d477468bb7cabfb65001"} Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.134979 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7649afe14ec5ab0dbef279c4cc985147cdb3a346b125d477468bb7cabfb65001" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.134820 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qwpzq" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.315689 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:55:16 crc kubenswrapper[4845]: E0202 10:55:16.316278 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08e1823d-46cd-40c5-bea1-162473f9a4ce" containerName="heat-cfnapi" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.316305 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="08e1823d-46cd-40c5-bea1-162473f9a4ce" containerName="heat-cfnapi" Feb 02 10:55:16 crc kubenswrapper[4845]: E0202 10:55:16.316329 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" containerName="nova-cell0-conductor-db-sync" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.316339 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" containerName="nova-cell0-conductor-db-sync" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.316612 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" containerName="nova-cell0-conductor-db-sync" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.316676 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="08e1823d-46cd-40c5-bea1-162473f9a4ce" containerName="heat-cfnapi" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.317522 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.319117 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-8lh6g" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.319602 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.339661 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.429125 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5e3b4b-9a44-4b50-8799-71f869de9028-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.429293 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zs6\" (UniqueName: \"kubernetes.io/projected/af5e3b4b-9a44-4b50-8799-71f869de9028-kube-api-access-q6zs6\") pod \"nova-cell0-conductor-0\" (UID: \"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.429327 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af5e3b4b-9a44-4b50-8799-71f869de9028-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.531992 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af5e3b4b-9a44-4b50-8799-71f869de9028-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.532171 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zs6\" (UniqueName: \"kubernetes.io/projected/af5e3b4b-9a44-4b50-8799-71f869de9028-kube-api-access-q6zs6\") pod \"nova-cell0-conductor-0\" (UID: \"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.532208 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af5e3b4b-9a44-4b50-8799-71f869de9028-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.537455 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af5e3b4b-9a44-4b50-8799-71f869de9028-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.542077 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af5e3b4b-9a44-4b50-8799-71f869de9028-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.557323 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zs6\" (UniqueName: \"kubernetes.io/projected/af5e3b4b-9a44-4b50-8799-71f869de9028-kube-api-access-q6zs6\") pod \"nova-cell0-conductor-0\" (UID: 
\"af5e3b4b-9a44-4b50-8799-71f869de9028\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:16 crc kubenswrapper[4845]: I0202 10:55:16.642945 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:17 crc kubenswrapper[4845]: I0202 10:55:17.146412 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:55:18 crc kubenswrapper[4845]: I0202 10:55:18.160649 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"af5e3b4b-9a44-4b50-8799-71f869de9028","Type":"ContainerStarted","Data":"2d72ecf8fc97f76106c058686a64dffee6b2ab0e9293464960787ca9a2163690"} Feb 02 10:55:18 crc kubenswrapper[4845]: I0202 10:55:18.161010 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"af5e3b4b-9a44-4b50-8799-71f869de9028","Type":"ContainerStarted","Data":"d8a9f03eec02b52248ed5434c4d3a71e0901cf641c7aee807f3852b772bc941f"} Feb 02 10:55:18 crc kubenswrapper[4845]: I0202 10:55:18.162660 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:18 crc kubenswrapper[4845]: I0202 10:55:18.187561 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.187543767 podStartE2EDuration="2.187543767s" podCreationTimestamp="2026-02-02 10:55:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:18.180639398 +0000 UTC m=+1399.272040858" watchObservedRunningTime="2026-02-02 10:55:18.187543767 +0000 UTC m=+1399.278945217" Feb 02 10:55:25 crc kubenswrapper[4845]: I0202 10:55:25.598684 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 02 10:55:26 crc kubenswrapper[4845]: I0202 
10:55:26.714452 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.142431 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-725jn"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.144005 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.147139 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.155159 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.168301 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-725jn"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.215935 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-scripts\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.215985 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.216015 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-282l4\" (UniqueName: 
\"kubernetes.io/projected/7439e987-75e8-4cc8-840a-742c6f07dea9-kube-api-access-282l4\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.216325 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-config-data\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.321693 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-config-data\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.321905 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-scripts\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.321959 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.322015 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-282l4\" (UniqueName: 
\"kubernetes.io/projected/7439e987-75e8-4cc8-840a-742c6f07dea9-kube-api-access-282l4\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.340332 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-config-data\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.343320 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-282l4\" (UniqueName: \"kubernetes.io/projected/7439e987-75e8-4cc8-840a-742c6f07dea9-kube-api-access-282l4\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.348911 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-scripts\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.349766 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-725jn\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.423007 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.428700 4845 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.437642 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.479860 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.489653 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.491782 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.501608 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.524116 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.554405 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gchv8\" (UniqueName: \"kubernetes.io/projected/516e4c98-314a-4116-b0fc-45c18fd1c7e1-kube-api-access-gchv8\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.554559 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-config-data\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.554648 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgljn\" 
(UniqueName: \"kubernetes.io/projected/1498a0e1-1035-4eba-bbc5-169cd1de86a0-kube-api-access-zgljn\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.554823 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.554878 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.554967 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-config-data\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.555209 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1498a0e1-1035-4eba-bbc5-169cd1de86a0-logs\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.555437 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.657549 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gchv8\" (UniqueName: \"kubernetes.io/projected/516e4c98-314a-4116-b0fc-45c18fd1c7e1-kube-api-access-gchv8\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.657638 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-config-data\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.657689 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgljn\" (UniqueName: \"kubernetes.io/projected/1498a0e1-1035-4eba-bbc5-169cd1de86a0-kube-api-access-zgljn\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.657778 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.657812 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.657837 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-config-data\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " 
pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.657972 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1498a0e1-1035-4eba-bbc5-169cd1de86a0-logs\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.667147 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1498a0e1-1035-4eba-bbc5-169cd1de86a0-logs\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.677205 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.678704 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-config-data\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.688519 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-config-data\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.688627 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.690618 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.696475 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.696855 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.709440 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gchv8\" (UniqueName: \"kubernetes.io/projected/516e4c98-314a-4116-b0fc-45c18fd1c7e1-kube-api-access-gchv8\") pod \"nova-scheduler-0\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.725062 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgljn\" (UniqueName: \"kubernetes.io/projected/1498a0e1-1035-4eba-bbc5-169cd1de86a0-kube-api-access-zgljn\") pod \"nova-api-0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.763396 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-config-data\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.767011 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.774483 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.774837 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca5ade4-10bb-4dc2-83c9-546c778230b1-logs\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.775386 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9dl5\" (UniqueName: \"kubernetes.io/projected/aca5ade4-10bb-4dc2-83c9-546c778230b1-kube-api-access-t9dl5\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.791242 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.791314 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.803366 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.804720 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.810772 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.839382 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hvdzc"] Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.864362 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.879180 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-config-data\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.879348 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.879407 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca5ade4-10bb-4dc2-83c9-546c778230b1-logs\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.879538 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9dl5\" (UniqueName: \"kubernetes.io/projected/aca5ade4-10bb-4dc2-83c9-546c778230b1-kube-api-access-t9dl5\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " 
pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.890589 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.890957 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca5ade4-10bb-4dc2-83c9-546c778230b1-logs\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.904491 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9dl5\" (UniqueName: \"kubernetes.io/projected/aca5ade4-10bb-4dc2-83c9-546c778230b1-kube-api-access-t9dl5\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.911351 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-config-data\") pod \"nova-metadata-0\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " pod="openstack/nova-metadata-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.948548 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:55:27 crc kubenswrapper[4845]: I0202 10:55:27.997964 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hvdzc"] Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.011946 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.012011 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.013144 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-config\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.013254 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.013817 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f27jh\" (UniqueName: \"kubernetes.io/projected/83cd6f6d-3615-46e0-875a-e1cec10e9631-kube-api-access-f27jh\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.013848 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7gzv\" (UniqueName: \"kubernetes.io/projected/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-kube-api-access-p7gzv\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.014242 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-svc\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.014565 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.014622 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117314 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-f27jh\" (UniqueName: \"kubernetes.io/projected/83cd6f6d-3615-46e0-875a-e1cec10e9631-kube-api-access-f27jh\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117354 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7gzv\" (UniqueName: \"kubernetes.io/projected/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-kube-api-access-p7gzv\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117395 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-svc\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117433 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117450 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117499 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117521 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117556 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-config\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.117586 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.119016 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.119812 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-nb\") pod 
\"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.122819 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-config\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.123970 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-svc\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.124455 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.124702 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.125395 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 
crc kubenswrapper[4845]: I0202 10:55:28.130057 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.149320 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f27jh\" (UniqueName: \"kubernetes.io/projected/83cd6f6d-3615-46e0-875a-e1cec10e9631-kube-api-access-f27jh\") pod \"dnsmasq-dns-9b86998b5-hvdzc\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.155148 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7gzv\" (UniqueName: \"kubernetes.io/projected/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-kube-api-access-p7gzv\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.225732 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.311861 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-725jn"] Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.444341 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.603572 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.753088 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r4lj7"] Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.754828 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.760380 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.760652 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.785431 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r4lj7"] Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.853865 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-scripts\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.854164 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7jh8\" (UniqueName: \"kubernetes.io/projected/7b2bad3a-8153-41d8-83f6-9f9caa16589b-kube-api-access-l7jh8\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.855713 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-config-data\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.856265 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.884962 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.903210 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.973617 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-scripts\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.973716 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7jh8\" (UniqueName: \"kubernetes.io/projected/7b2bad3a-8153-41d8-83f6-9f9caa16589b-kube-api-access-l7jh8\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.973800 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-config-data\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:28 crc kubenswrapper[4845]: I0202 10:55:28.973926 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:28.998234 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-scripts\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:28.999485 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.003146 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-config-data\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.029612 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7jh8\" (UniqueName: \"kubernetes.io/projected/7b2bad3a-8153-41d8-83f6-9f9caa16589b-kube-api-access-l7jh8\") pod \"nova-cell1-conductor-db-sync-r4lj7\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") " pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.091832 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.291723 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hvdzc"] Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.294611 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca5ade4-10bb-4dc2-83c9-546c778230b1","Type":"ContainerStarted","Data":"796d16b6c2b2f99942af8c650f6fd9074569009a172c9c4df05a46809ca9e640"} Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.307811 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1498a0e1-1035-4eba-bbc5-169cd1de86a0","Type":"ContainerStarted","Data":"f57f9e17a253f3027639c6870894363d5912e86c0ad65390204a822d4d8ff33a"} Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.316266 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"516e4c98-314a-4116-b0fc-45c18fd1c7e1","Type":"ContainerStarted","Data":"ad10cf81d533e8b37ef72f4ca4f01fc9f32600963e1feb8c42617710f205cfdd"} Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.318712 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-725jn" event={"ID":"7439e987-75e8-4cc8-840a-742c6f07dea9","Type":"ContainerStarted","Data":"7ebfe4b502f94b649860108409e8e9b93586764606d6f176206def30cf86a61d"} Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.318739 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-725jn" event={"ID":"7439e987-75e8-4cc8-840a-742c6f07dea9","Type":"ContainerStarted","Data":"b69a35d9b6473cf3a175810e25cf4dd4eb5c8693efc5afa386457c19bc88440b"} Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.373551 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-725jn" podStartSLOduration=2.373523951 
podStartE2EDuration="2.373523951s" podCreationTimestamp="2026-02-02 10:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:29.345183175 +0000 UTC m=+1410.436584645" watchObservedRunningTime="2026-02-02 10:55:29.373523951 +0000 UTC m=+1410.464925401" Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.445694 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:55:29 crc kubenswrapper[4845]: I0202 10:55:29.876651 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r4lj7"] Feb 02 10:55:30 crc kubenswrapper[4845]: I0202 10:55:30.339141 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98","Type":"ContainerStarted","Data":"fe04f70476c707b2543738532ef74000620c8c2924f735fde10bff5d95053cb3"} Feb 02 10:55:30 crc kubenswrapper[4845]: I0202 10:55:30.353319 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r4lj7" event={"ID":"7b2bad3a-8153-41d8-83f6-9f9caa16589b","Type":"ContainerStarted","Data":"f926f2a33b190bee678900de2f6fcfec869f9b6aacd707a0eedb2404459e01e7"} Feb 02 10:55:30 crc kubenswrapper[4845]: I0202 10:55:30.353367 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r4lj7" event={"ID":"7b2bad3a-8153-41d8-83f6-9f9caa16589b","Type":"ContainerStarted","Data":"da194489804af83cf7bd974e17436580f313625e8bbc411b3359df8752a517c2"} Feb 02 10:55:30 crc kubenswrapper[4845]: I0202 10:55:30.387265 4845 generic.go:334] "Generic (PLEG): container finished" podID="83cd6f6d-3615-46e0-875a-e1cec10e9631" containerID="f824758afe3c663578c7666b1f094db1caee520920ba737c52e548b07f3ee9ad" exitCode=0 Feb 02 10:55:30 crc kubenswrapper[4845]: I0202 10:55:30.387409 4845 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" event={"ID":"83cd6f6d-3615-46e0-875a-e1cec10e9631","Type":"ContainerDied","Data":"f824758afe3c663578c7666b1f094db1caee520920ba737c52e548b07f3ee9ad"} Feb 02 10:55:30 crc kubenswrapper[4845]: I0202 10:55:30.387474 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" event={"ID":"83cd6f6d-3615-46e0-875a-e1cec10e9631","Type":"ContainerStarted","Data":"5459dadbda58e0ce878baeea7244a3a46fe1a38ff8b8032f5afcf3e9f7c8bd0d"} Feb 02 10:55:30 crc kubenswrapper[4845]: I0202 10:55:30.390278 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-r4lj7" podStartSLOduration=2.390254938 podStartE2EDuration="2.390254938s" podCreationTimestamp="2026-02-02 10:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:30.379440086 +0000 UTC m=+1411.470841536" watchObservedRunningTime="2026-02-02 10:55:30.390254938 +0000 UTC m=+1411.481656388" Feb 02 10:55:31 crc kubenswrapper[4845]: I0202 10:55:31.298346 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:55:31 crc kubenswrapper[4845]: I0202 10:55:31.330958 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:31 crc kubenswrapper[4845]: I0202 10:55:31.403957 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" event={"ID":"83cd6f6d-3615-46e0-875a-e1cec10e9631","Type":"ContainerStarted","Data":"4588a5acb74ef82abd161d26d28f375cb1f56efac97a86455f032c5662e7188e"} Feb 02 10:55:31 crc kubenswrapper[4845]: I0202 10:55:31.404530 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:31 crc kubenswrapper[4845]: I0202 10:55:31.427141 4845 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" podStartSLOduration=4.427120114 podStartE2EDuration="4.427120114s" podCreationTimestamp="2026-02-02 10:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:31.424989893 +0000 UTC m=+1412.516391343" watchObservedRunningTime="2026-02-02 10:55:31.427120114 +0000 UTC m=+1412.518521564" Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.434947 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98","Type":"ContainerStarted","Data":"f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad"} Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.435071 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad" gracePeriod=30 Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.437573 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"516e4c98-314a-4116-b0fc-45c18fd1c7e1","Type":"ContainerStarted","Data":"86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22"} Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.440167 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca5ade4-10bb-4dc2-83c9-546c778230b1","Type":"ContainerStarted","Data":"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4"} Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.440212 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerName="nova-metadata-log" containerID="cri-o://e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8" gracePeriod=30 Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.440222 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerName="nova-metadata-metadata" containerID="cri-o://3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4" gracePeriod=30 Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.440230 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca5ade4-10bb-4dc2-83c9-546c778230b1","Type":"ContainerStarted","Data":"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8"} Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.449160 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1498a0e1-1035-4eba-bbc5-169cd1de86a0","Type":"ContainerStarted","Data":"7371036457c908a60d98a53f34d54f0d70618efbebcf97fbc3e3f1b041ae7110"} Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.449213 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1498a0e1-1035-4eba-bbc5-169cd1de86a0","Type":"ContainerStarted","Data":"54419ef52d04b18470afc9cd9fe1e7776568928b396efb56b1f79342767e7b05"} Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.475325 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.707314967 podStartE2EDuration="7.475300974s" podCreationTimestamp="2026-02-02 10:55:27 +0000 UTC" firstStartedPulling="2026-02-02 10:55:29.592107325 +0000 UTC m=+1410.683508775" lastFinishedPulling="2026-02-02 10:55:33.360093342 +0000 UTC m=+1414.451494782" observedRunningTime="2026-02-02 10:55:34.463385021 +0000 UTC m=+1415.554786471" 
watchObservedRunningTime="2026-02-02 10:55:34.475300974 +0000 UTC m=+1415.566702434" Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.498504 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.121617453 podStartE2EDuration="7.498477572s" podCreationTimestamp="2026-02-02 10:55:27 +0000 UTC" firstStartedPulling="2026-02-02 10:55:28.955381072 +0000 UTC m=+1410.046782522" lastFinishedPulling="2026-02-02 10:55:33.332241201 +0000 UTC m=+1414.423642641" observedRunningTime="2026-02-02 10:55:34.484433727 +0000 UTC m=+1415.575835177" watchObservedRunningTime="2026-02-02 10:55:34.498477572 +0000 UTC m=+1415.589879022" Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.511291 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.830300344 podStartE2EDuration="7.51127178s" podCreationTimestamp="2026-02-02 10:55:27 +0000 UTC" firstStartedPulling="2026-02-02 10:55:28.64208043 +0000 UTC m=+1409.733481880" lastFinishedPulling="2026-02-02 10:55:33.323051866 +0000 UTC m=+1414.414453316" observedRunningTime="2026-02-02 10:55:34.504755523 +0000 UTC m=+1415.596156973" watchObservedRunningTime="2026-02-02 10:55:34.51127178 +0000 UTC m=+1415.602673230" Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.529049 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.1610733890000002 podStartE2EDuration="7.529025701s" podCreationTimestamp="2026-02-02 10:55:27 +0000 UTC" firstStartedPulling="2026-02-02 10:55:28.955502045 +0000 UTC m=+1410.046903495" lastFinishedPulling="2026-02-02 10:55:33.323454357 +0000 UTC m=+1414.414855807" observedRunningTime="2026-02-02 10:55:34.5199662 +0000 UTC m=+1415.611367650" watchObservedRunningTime="2026-02-02 10:55:34.529025701 +0000 UTC m=+1415.620427151" Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.892466 4845 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:55:34 crc kubenswrapper[4845]: I0202 10:55:34.893413 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31" containerName="kube-state-metrics" containerID="cri-o://cacef27d0dbc8696a18778de5ec2dfe7815ec16e5c7a5beb939fff7f7c4e5a61" gracePeriod=30 Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.128130 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.128603 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="ada4f3a2-2715-4c0c-bc32-5c488a2e1996" containerName="mysqld-exporter" containerID="cri-o://995f2b7a30c667d098b78f0a5f78fb72fb30f1f5da96bbc6a50e3a4f536e40bb" gracePeriod=30 Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.240790 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.290534 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca5ade4-10bb-4dc2-83c9-546c778230b1-logs\") pod \"aca5ade4-10bb-4dc2-83c9-546c778230b1\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.290641 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-combined-ca-bundle\") pod \"aca5ade4-10bb-4dc2-83c9-546c778230b1\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.290673 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-config-data\") pod \"aca5ade4-10bb-4dc2-83c9-546c778230b1\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.290754 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9dl5\" (UniqueName: \"kubernetes.io/projected/aca5ade4-10bb-4dc2-83c9-546c778230b1-kube-api-access-t9dl5\") pod \"aca5ade4-10bb-4dc2-83c9-546c778230b1\" (UID: \"aca5ade4-10bb-4dc2-83c9-546c778230b1\") " Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.291229 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aca5ade4-10bb-4dc2-83c9-546c778230b1-logs" (OuterVolumeSpecName: "logs") pod "aca5ade4-10bb-4dc2-83c9-546c778230b1" (UID: "aca5ade4-10bb-4dc2-83c9-546c778230b1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.291857 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aca5ade4-10bb-4dc2-83c9-546c778230b1-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.298501 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aca5ade4-10bb-4dc2-83c9-546c778230b1-kube-api-access-t9dl5" (OuterVolumeSpecName: "kube-api-access-t9dl5") pod "aca5ade4-10bb-4dc2-83c9-546c778230b1" (UID: "aca5ade4-10bb-4dc2-83c9-546c778230b1"). InnerVolumeSpecName "kube-api-access-t9dl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.338166 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aca5ade4-10bb-4dc2-83c9-546c778230b1" (UID: "aca5ade4-10bb-4dc2-83c9-546c778230b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.350864 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-config-data" (OuterVolumeSpecName: "config-data") pod "aca5ade4-10bb-4dc2-83c9-546c778230b1" (UID: "aca5ade4-10bb-4dc2-83c9-546c778230b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.395997 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.396031 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aca5ade4-10bb-4dc2-83c9-546c778230b1-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.396043 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9dl5\" (UniqueName: \"kubernetes.io/projected/aca5ade4-10bb-4dc2-83c9-546c778230b1-kube-api-access-t9dl5\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.495787 4845 generic.go:334] "Generic (PLEG): container finished" podID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerID="3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4" exitCode=0 Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.495825 4845 generic.go:334] "Generic (PLEG): container finished" podID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerID="e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8" exitCode=143 Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.495915 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca5ade4-10bb-4dc2-83c9-546c778230b1","Type":"ContainerDied","Data":"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4"} Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.495951 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca5ade4-10bb-4dc2-83c9-546c778230b1","Type":"ContainerDied","Data":"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8"} Feb 02 10:55:35 crc 
kubenswrapper[4845]: I0202 10:55:35.495964 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"aca5ade4-10bb-4dc2-83c9-546c778230b1","Type":"ContainerDied","Data":"796d16b6c2b2f99942af8c650f6fd9074569009a172c9c4df05a46809ca9e640"} Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.495981 4845 scope.go:117] "RemoveContainer" containerID="3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.496157 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.512243 4845 generic.go:334] "Generic (PLEG): container finished" podID="ada4f3a2-2715-4c0c-bc32-5c488a2e1996" containerID="995f2b7a30c667d098b78f0a5f78fb72fb30f1f5da96bbc6a50e3a4f536e40bb" exitCode=2 Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.512337 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ada4f3a2-2715-4c0c-bc32-5c488a2e1996","Type":"ContainerDied","Data":"995f2b7a30c667d098b78f0a5f78fb72fb30f1f5da96bbc6a50e3a4f536e40bb"} Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.531160 4845 generic.go:334] "Generic (PLEG): container finished" podID="a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31" containerID="cacef27d0dbc8696a18778de5ec2dfe7815ec16e5c7a5beb939fff7f7c4e5a61" exitCode=2 Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.532456 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31","Type":"ContainerDied","Data":"cacef27d0dbc8696a18778de5ec2dfe7815ec16e5c7a5beb939fff7f7c4e5a61"} Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.603078 4845 scope.go:117] "RemoveContainer" containerID="e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.610508 4845 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.665777 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.762535 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" path="/var/lib/kubelet/pods/aca5ade4-10bb-4dc2-83c9-546c778230b1/volumes" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.764649 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:35 crc kubenswrapper[4845]: E0202 10:55:35.765140 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerName="nova-metadata-log" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.765159 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerName="nova-metadata-log" Feb 02 10:55:35 crc kubenswrapper[4845]: E0202 10:55:35.765191 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerName="nova-metadata-metadata" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.765201 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerName="nova-metadata-metadata" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.765489 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerName="nova-metadata-log" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.765522 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="aca5ade4-10bb-4dc2-83c9-546c778230b1" containerName="nova-metadata-metadata" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.771196 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.772860 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.777761 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.779374 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.785380 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.877298 4845 scope.go:117] "RemoveContainer" containerID="3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4" Feb 02 10:55:35 crc kubenswrapper[4845]: E0202 10:55:35.881317 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4\": container with ID starting with 3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4 not found: ID does not exist" containerID="3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.881376 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4"} err="failed to get container status \"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4\": rpc error: code = NotFound desc = could not find container \"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4\": container with ID starting with 3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4 not found: ID does not exist" Feb 02 10:55:35 crc 
kubenswrapper[4845]: I0202 10:55:35.881406 4845 scope.go:117] "RemoveContainer" containerID="e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8" Feb 02 10:55:35 crc kubenswrapper[4845]: E0202 10:55:35.885396 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8\": container with ID starting with e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8 not found: ID does not exist" containerID="e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.885434 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8"} err="failed to get container status \"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8\": rpc error: code = NotFound desc = could not find container \"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8\": container with ID starting with e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8 not found: ID does not exist" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.885461 4845 scope.go:117] "RemoveContainer" containerID="3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.885665 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4"} err="failed to get container status \"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4\": rpc error: code = NotFound desc = could not find container \"3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4\": container with ID starting with 3b4016fad380e8d5250c22e00f2cccbbc1b2ac3899c47e7955ec48e2348141c4 not found: ID does not exist" Feb 02 
10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.885679 4845 scope.go:117] "RemoveContainer" containerID="e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.885844 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8"} err="failed to get container status \"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8\": rpc error: code = NotFound desc = could not find container \"e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8\": container with ID starting with e096766b4b6ca60fad6f519a167619715093a61239745550dfa1ef9860d2d4a8 not found: ID does not exist" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.933373 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr6hd\" (UniqueName: \"kubernetes.io/projected/a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31-kube-api-access-pr6hd\") pod \"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31\" (UID: \"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31\") " Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.935138 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-config-data\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.935214 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8rlf\" (UniqueName: \"kubernetes.io/projected/62eefe51-d633-47bd-b7b8-1b786cc8bdde-kube-api-access-v8rlf\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.935635 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.936029 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.936154 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62eefe51-d633-47bd-b7b8-1b786cc8bdde-logs\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:35 crc kubenswrapper[4845]: I0202 10:55:35.953692 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31-kube-api-access-pr6hd" (OuterVolumeSpecName: "kube-api-access-pr6hd") pod "a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31" (UID: "a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31"). InnerVolumeSpecName "kube-api-access-pr6hd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.043615 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62eefe51-d633-47bd-b7b8-1b786cc8bdde-logs\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.043723 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-config-data\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.043750 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8rlf\" (UniqueName: \"kubernetes.io/projected/62eefe51-d633-47bd-b7b8-1b786cc8bdde-kube-api-access-v8rlf\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.043844 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.043955 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.044042 4845 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pr6hd\" (UniqueName: \"kubernetes.io/projected/a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31-kube-api-access-pr6hd\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.044689 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62eefe51-d633-47bd-b7b8-1b786cc8bdde-logs\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.050498 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.051149 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-config-data\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.052212 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.062749 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rlf\" (UniqueName: \"kubernetes.io/projected/62eefe51-d633-47bd-b7b8-1b786cc8bdde-kube-api-access-v8rlf\") pod \"nova-metadata-0\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 
10:55:36.063060 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.104744 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.260757 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-combined-ca-bundle\") pod \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.260867 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-config-data\") pod \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.261016 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zjmd\" (UniqueName: \"kubernetes.io/projected/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-kube-api-access-5zjmd\") pod \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\" (UID: \"ada4f3a2-2715-4c0c-bc32-5c488a2e1996\") " Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.275124 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-kube-api-access-5zjmd" (OuterVolumeSpecName: "kube-api-access-5zjmd") pod "ada4f3a2-2715-4c0c-bc32-5c488a2e1996" (UID: "ada4f3a2-2715-4c0c-bc32-5c488a2e1996"). InnerVolumeSpecName "kube-api-access-5zjmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.343154 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ada4f3a2-2715-4c0c-bc32-5c488a2e1996" (UID: "ada4f3a2-2715-4c0c-bc32-5c488a2e1996"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.364908 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.364946 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zjmd\" (UniqueName: \"kubernetes.io/projected/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-kube-api-access-5zjmd\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.385200 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-config-data" (OuterVolumeSpecName: "config-data") pod "ada4f3a2-2715-4c0c-bc32-5c488a2e1996" (UID: "ada4f3a2-2715-4c0c-bc32-5c488a2e1996"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.467042 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada4f3a2-2715-4c0c-bc32-5c488a2e1996-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.544340 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31","Type":"ContainerDied","Data":"e2f9a94d7e921b062f55ff2afa3f02c589bdda200432ed3b8b900c15b062f04e"} Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.544380 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.544427 4845 scope.go:117] "RemoveContainer" containerID="cacef27d0dbc8696a18778de5ec2dfe7815ec16e5c7a5beb939fff7f7c4e5a61" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.558169 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"ada4f3a2-2715-4c0c-bc32-5c488a2e1996","Type":"ContainerDied","Data":"4f3a714246c19659b6b750ef01e05f53d25576adc8d172e3db972b276255f379"} Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.558340 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.589792 4845 scope.go:117] "RemoveContainer" containerID="995f2b7a30c667d098b78f0a5f78fb72fb30f1f5da96bbc6a50e3a4f536e40bb" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.590188 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.631541 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.646447 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.656522 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.666667 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: E0202 10:55:36.667575 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada4f3a2-2715-4c0c-bc32-5c488a2e1996" containerName="mysqld-exporter" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.667608 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada4f3a2-2715-4c0c-bc32-5c488a2e1996" containerName="mysqld-exporter" Feb 02 10:55:36 crc kubenswrapper[4845]: E0202 10:55:36.667629 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31" containerName="kube-state-metrics" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.667637 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31" containerName="kube-state-metrics" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.667904 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31" 
containerName="kube-state-metrics" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.667942 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada4f3a2-2715-4c0c-bc32-5c488a2e1996" containerName="mysqld-exporter" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.689123 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.692535 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.692634 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.696031 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.696327 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.696590 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.696762 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.696848 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.708044 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.728812 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.885456 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.885565 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.885623 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v5g6\" (UniqueName: \"kubernetes.io/projected/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-api-access-7v5g6\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.885679 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-config-data\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.885828 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4h9q\" (UniqueName: \"kubernetes.io/projected/6704fdd3-f589-4ccd-9a52-4a914e219b09-kube-api-access-k4h9q\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.885922 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.887705 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.887778 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.989970 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4h9q\" (UniqueName: \"kubernetes.io/projected/6704fdd3-f589-4ccd-9a52-4a914e219b09-kube-api-access-k4h9q\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.990044 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.990064 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.990096 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.990138 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.990187 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.990219 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v5g6\" (UniqueName: \"kubernetes.io/projected/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-api-access-7v5g6\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.990266 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-config-data\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.994670 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-config-data\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.994987 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.995650 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:36 crc kubenswrapper[4845]: I0202 10:55:36.995773 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6704fdd3-f589-4ccd-9a52-4a914e219b09-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.002428 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.003232 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.018986 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v5g6\" (UniqueName: \"kubernetes.io/projected/f70412f5-a824-45b2-92c2-8e37a25d540a-kube-api-access-7v5g6\") pod \"kube-state-metrics-0\" (UID: \"f70412f5-a824-45b2-92c2-8e37a25d540a\") " pod="openstack/kube-state-metrics-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.026587 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4h9q\" (UniqueName: \"kubernetes.io/projected/6704fdd3-f589-4ccd-9a52-4a914e219b09-kube-api-access-k4h9q\") pod \"mysqld-exporter-0\" (UID: \"6704fdd3-f589-4ccd-9a52-4a914e219b09\") " pod="openstack/mysqld-exporter-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.035341 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.112085 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.598979 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62eefe51-d633-47bd-b7b8-1b786cc8bdde","Type":"ContainerStarted","Data":"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587"} Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.599300 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62eefe51-d633-47bd-b7b8-1b786cc8bdde","Type":"ContainerStarted","Data":"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34"} Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.599319 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62eefe51-d633-47bd-b7b8-1b786cc8bdde","Type":"ContainerStarted","Data":"3d15c9909f7d9d3d544fcb74bd77649d5815fc4550664c3c132f452b9ce91981"} Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.627483 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.631656 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.63164044 podStartE2EDuration="2.63164044s" podCreationTimestamp="2026-02-02 10:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:37.628317744 +0000 UTC m=+1418.719719214" watchObservedRunningTime="2026-02-02 10:55:37.63164044 +0000 UTC m=+1418.723041880" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.735300 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31" path="/var/lib/kubelet/pods/a7c0daea-a5c7-4695-bd4f-ad9a3aaf7d31/volumes" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.737618 4845 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada4f3a2-2715-4c0c-bc32-5c488a2e1996" path="/var/lib/kubelet/pods/ada4f3a2-2715-4c0c-bc32-5c488a2e1996/volumes" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.746558 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.767470 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.767527 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.949858 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.949930 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 10:55:37 crc kubenswrapper[4845]: I0202 10:55:37.983739 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.228148 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.307599 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-nh2sl"] Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.307836 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" podUID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" containerName="dnsmasq-dns" containerID="cri-o://f6562082a819ade2f17a46123a0743170963965766d739159cebc380dae9a85f" gracePeriod=10 Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.446236 4845 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.469876 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.470268 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="ceilometer-central-agent" containerID="cri-o://59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017" gracePeriod=30 Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.470627 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="proxy-httpd" containerID="cri-o://c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699" gracePeriod=30 Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.470794 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="ceilometer-notification-agent" containerID="cri-o://db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0" gracePeriod=30 Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.471041 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="sg-core" containerID="cri-o://66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358" gracePeriod=30 Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.638499 4845 generic.go:334] "Generic (PLEG): container finished" podID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerID="66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358" exitCode=2 Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.638577 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerDied","Data":"66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358"} Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.640618 4845 generic.go:334] "Generic (PLEG): container finished" podID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" containerID="f6562082a819ade2f17a46123a0743170963965766d739159cebc380dae9a85f" exitCode=0 Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.640674 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" event={"ID":"2ab561fd-1cd4-43c4-a09d-401ca966b4bb","Type":"ContainerDied","Data":"f6562082a819ade2f17a46123a0743170963965766d739159cebc380dae9a85f"} Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.645480 4845 generic.go:334] "Generic (PLEG): container finished" podID="7439e987-75e8-4cc8-840a-742c6f07dea9" containerID="7ebfe4b502f94b649860108409e8e9b93586764606d6f176206def30cf86a61d" exitCode=0 Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.645546 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-725jn" event={"ID":"7439e987-75e8-4cc8-840a-742c6f07dea9","Type":"ContainerDied","Data":"7ebfe4b502f94b649860108409e8e9b93586764606d6f176206def30cf86a61d"} Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.648654 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"6704fdd3-f589-4ccd-9a52-4a914e219b09","Type":"ContainerStarted","Data":"eb6f676c13c56bf650964bf2919685e2373ea49300b6130e065763a1f64a68f1"} Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.651776 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f70412f5-a824-45b2-92c2-8e37a25d540a","Type":"ContainerStarted","Data":"4e42d1ddcc432477461c19616063b7119fe01ff4bd6ebccbbf31061234415ace"} Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.695595 4845 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.854166 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.240:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:55:38 crc kubenswrapper[4845]: I0202 10:55:38.854757 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.240:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.020942 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.048065 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-sb\") pod \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.048135 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-svc\") pod \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.048193 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-nb\") pod 
\"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.048335 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-config\") pod \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.048391 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-swift-storage-0\") pod \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.048471 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbcn2\" (UniqueName: \"kubernetes.io/projected/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-kube-api-access-sbcn2\") pod \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\" (UID: \"2ab561fd-1cd4-43c4-a09d-401ca966b4bb\") " Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.053471 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-kube-api-access-sbcn2" (OuterVolumeSpecName: "kube-api-access-sbcn2") pod "2ab561fd-1cd4-43c4-a09d-401ca966b4bb" (UID: "2ab561fd-1cd4-43c4-a09d-401ca966b4bb"). InnerVolumeSpecName "kube-api-access-sbcn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.152388 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbcn2\" (UniqueName: \"kubernetes.io/projected/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-kube-api-access-sbcn2\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.286170 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ab561fd-1cd4-43c4-a09d-401ca966b4bb" (UID: "2ab561fd-1cd4-43c4-a09d-401ca966b4bb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.303003 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-config" (OuterVolumeSpecName: "config") pod "2ab561fd-1cd4-43c4-a09d-401ca966b4bb" (UID: "2ab561fd-1cd4-43c4-a09d-401ca966b4bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.304673 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2ab561fd-1cd4-43c4-a09d-401ca966b4bb" (UID: "2ab561fd-1cd4-43c4-a09d-401ca966b4bb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.306657 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ab561fd-1cd4-43c4-a09d-401ca966b4bb" (UID: "2ab561fd-1cd4-43c4-a09d-401ca966b4bb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.318068 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ab561fd-1cd4-43c4-a09d-401ca966b4bb" (UID: "2ab561fd-1cd4-43c4-a09d-401ca966b4bb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.358173 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.358213 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.358224 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.358235 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:39 crc 
kubenswrapper[4845]: I0202 10:55:39.358249 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2ab561fd-1cd4-43c4-a09d-401ca966b4bb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.665667 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.665692 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-nh2sl" event={"ID":"2ab561fd-1cd4-43c4-a09d-401ca966b4bb","Type":"ContainerDied","Data":"250cda2944b48f437289aaa5905e90991c9b8dc1b5cb5593f83b10af7cbf343a"} Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.665767 4845 scope.go:117] "RemoveContainer" containerID="f6562082a819ade2f17a46123a0743170963965766d739159cebc380dae9a85f" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.667409 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"6704fdd3-f589-4ccd-9a52-4a914e219b09","Type":"ContainerStarted","Data":"7bee02cd6eb60a2be51967ead29a3f60c6e67e4f6cb4d15d685a83a6496ac623"} Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.670147 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.670877 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f70412f5-a824-45b2-92c2-8e37a25d540a","Type":"ContainerStarted","Data":"0a5ef7131a1aa86b3afe64adf7aee4c9bd44c4d90ab9f37331cecd55bf74302a"} Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.676047 4845 generic.go:334] "Generic (PLEG): container finished" podID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerID="c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699" exitCode=0 Feb 02 10:55:39 crc 
kubenswrapper[4845]: I0202 10:55:39.676085 4845 generic.go:334] "Generic (PLEG): container finished" podID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerID="59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017" exitCode=0 Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.676266 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerDied","Data":"c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699"} Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.676301 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerDied","Data":"59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017"} Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.693434 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.322513867 podStartE2EDuration="3.693415308s" podCreationTimestamp="2026-02-02 10:55:36 +0000 UTC" firstStartedPulling="2026-02-02 10:55:37.743951773 +0000 UTC m=+1418.835353223" lastFinishedPulling="2026-02-02 10:55:38.114853224 +0000 UTC m=+1419.206254664" observedRunningTime="2026-02-02 10:55:39.69036926 +0000 UTC m=+1420.781770730" watchObservedRunningTime="2026-02-02 10:55:39.693415308 +0000 UTC m=+1420.784816758" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.710227 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.009824553 podStartE2EDuration="3.710207311s" podCreationTimestamp="2026-02-02 10:55:36 +0000 UTC" firstStartedPulling="2026-02-02 10:55:37.633223565 +0000 UTC m=+1418.724625015" lastFinishedPulling="2026-02-02 10:55:38.333606323 +0000 UTC m=+1419.425007773" observedRunningTime="2026-02-02 10:55:39.706296659 +0000 UTC m=+1420.797698109" 
watchObservedRunningTime="2026-02-02 10:55:39.710207311 +0000 UTC m=+1420.801608761" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.728993 4845 scope.go:117] "RemoveContainer" containerID="47eae6a3cd68dd92b0b808c46cdd7b757872d2c3608874a20759716d81ea4849" Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.741125 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-nh2sl"] Feb 02 10:55:39 crc kubenswrapper[4845]: I0202 10:55:39.752905 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-nh2sl"] Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.240660 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.382053 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-282l4\" (UniqueName: \"kubernetes.io/projected/7439e987-75e8-4cc8-840a-742c6f07dea9-kube-api-access-282l4\") pod \"7439e987-75e8-4cc8-840a-742c6f07dea9\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.382156 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-config-data\") pod \"7439e987-75e8-4cc8-840a-742c6f07dea9\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.382237 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-scripts\") pod \"7439e987-75e8-4cc8-840a-742c6f07dea9\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.382319 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-combined-ca-bundle\") pod \"7439e987-75e8-4cc8-840a-742c6f07dea9\" (UID: \"7439e987-75e8-4cc8-840a-742c6f07dea9\") " Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.391042 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-scripts" (OuterVolumeSpecName: "scripts") pod "7439e987-75e8-4cc8-840a-742c6f07dea9" (UID: "7439e987-75e8-4cc8-840a-742c6f07dea9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.418409 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7439e987-75e8-4cc8-840a-742c6f07dea9-kube-api-access-282l4" (OuterVolumeSpecName: "kube-api-access-282l4") pod "7439e987-75e8-4cc8-840a-742c6f07dea9" (UID: "7439e987-75e8-4cc8-840a-742c6f07dea9"). InnerVolumeSpecName "kube-api-access-282l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.426601 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-config-data" (OuterVolumeSpecName: "config-data") pod "7439e987-75e8-4cc8-840a-742c6f07dea9" (UID: "7439e987-75e8-4cc8-840a-742c6f07dea9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.454416 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7439e987-75e8-4cc8-840a-742c6f07dea9" (UID: "7439e987-75e8-4cc8-840a-742c6f07dea9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.485479 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.485510 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.485523 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-282l4\" (UniqueName: \"kubernetes.io/projected/7439e987-75e8-4cc8-840a-742c6f07dea9-kube-api-access-282l4\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.485533 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7439e987-75e8-4cc8-840a-742c6f07dea9-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.694489 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-725jn" event={"ID":"7439e987-75e8-4cc8-840a-742c6f07dea9","Type":"ContainerDied","Data":"b69a35d9b6473cf3a175810e25cf4dd4eb5c8693efc5afa386457c19bc88440b"} Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.694898 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69a35d9b6473cf3a175810e25cf4dd4eb5c8693efc5afa386457c19bc88440b" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.694987 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-725jn" Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.701751 4845 generic.go:334] "Generic (PLEG): container finished" podID="7b2bad3a-8153-41d8-83f6-9f9caa16589b" containerID="f926f2a33b190bee678900de2f6fcfec869f9b6aacd707a0eedb2404459e01e7" exitCode=0 Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.701834 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r4lj7" event={"ID":"7b2bad3a-8153-41d8-83f6-9f9caa16589b","Type":"ContainerDied","Data":"f926f2a33b190bee678900de2f6fcfec869f9b6aacd707a0eedb2404459e01e7"} Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.860624 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.860985 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-api" containerID="cri-o://7371036457c908a60d98a53f34d54f0d70618efbebcf97fbc3e3f1b041ae7110" gracePeriod=30 Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.861000 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-log" containerID="cri-o://54419ef52d04b18470afc9cd9fe1e7776568928b396efb56b1f79342767e7b05" gracePeriod=30 Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.883069 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.883396 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="516e4c98-314a-4116-b0fc-45c18fd1c7e1" containerName="nova-scheduler-scheduler" containerID="cri-o://86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22" gracePeriod=30 Feb 02 
10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.897624 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.897942 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerName="nova-metadata-log" containerID="cri-o://dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34" gracePeriod=30 Feb 02 10:55:40 crc kubenswrapper[4845]: I0202 10:55:40.898038 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerName="nova-metadata-metadata" containerID="cri-o://5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587" gracePeriod=30 Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.105120 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.105206 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.698714 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.705399 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.727962 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" path="/var/lib/kubelet/pods/2ab561fd-1cd4-43c4-a09d-401ca966b4bb/volumes" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.729925 4845 generic.go:334] "Generic (PLEG): container finished" podID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerID="5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587" exitCode=0 Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.729957 4845 generic.go:334] "Generic (PLEG): container finished" podID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerID="dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34" exitCode=143 Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.730081 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.734096 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62eefe51-d633-47bd-b7b8-1b786cc8bdde","Type":"ContainerDied","Data":"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587"} Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.734138 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62eefe51-d633-47bd-b7b8-1b786cc8bdde","Type":"ContainerDied","Data":"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34"} Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.734149 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"62eefe51-d633-47bd-b7b8-1b786cc8bdde","Type":"ContainerDied","Data":"3d15c9909f7d9d3d544fcb74bd77649d5815fc4550664c3c132f452b9ce91981"} Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.734175 4845 scope.go:117] "RemoveContainer" 
containerID="5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.756816 4845 generic.go:334] "Generic (PLEG): container finished" podID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerID="54419ef52d04b18470afc9cd9fe1e7776568928b396efb56b1f79342767e7b05" exitCode=143 Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.756932 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1498a0e1-1035-4eba-bbc5-169cd1de86a0","Type":"ContainerDied","Data":"54419ef52d04b18470afc9cd9fe1e7776568928b396efb56b1f79342767e7b05"} Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.760857 4845 generic.go:334] "Generic (PLEG): container finished" podID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerID="db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0" exitCode=0 Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.761178 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.761709 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerDied","Data":"db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0"} Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.761749 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1c009cae-0016-4d35-9773-1e313feb5c4a","Type":"ContainerDied","Data":"4a2f52662f64ac4adafea886384b8aff903812b748685ed30b174caef644d27d"} Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.807702 4845 scope.go:117] "RemoveContainer" containerID="dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.819736 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-run-httpd\") pod \"1c009cae-0016-4d35-9773-1e313feb5c4a\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.819781 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-config-data\") pod \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.819853 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8rlf\" (UniqueName: \"kubernetes.io/projected/62eefe51-d633-47bd-b7b8-1b786cc8bdde-kube-api-access-v8rlf\") pod \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.819959 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bknng\" (UniqueName: \"kubernetes.io/projected/1c009cae-0016-4d35-9773-1e313feb5c4a-kube-api-access-bknng\") pod \"1c009cae-0016-4d35-9773-1e313feb5c4a\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.820008 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-scripts\") pod \"1c009cae-0016-4d35-9773-1e313feb5c4a\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.820028 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-combined-ca-bundle\") pod \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.820128 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-sg-core-conf-yaml\") pod \"1c009cae-0016-4d35-9773-1e313feb5c4a\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.820162 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-combined-ca-bundle\") pod \"1c009cae-0016-4d35-9773-1e313feb5c4a\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.820186 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-nova-metadata-tls-certs\") pod 
\"62eefe51-d633-47bd-b7b8-1b786cc8bdde\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.820205 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62eefe51-d633-47bd-b7b8-1b786cc8bdde-logs\") pod \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\" (UID: \"62eefe51-d633-47bd-b7b8-1b786cc8bdde\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.820308 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-log-httpd\") pod \"1c009cae-0016-4d35-9773-1e313feb5c4a\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.820353 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-config-data\") pod \"1c009cae-0016-4d35-9773-1e313feb5c4a\" (UID: \"1c009cae-0016-4d35-9773-1e313feb5c4a\") " Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.821873 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1c009cae-0016-4d35-9773-1e313feb5c4a" (UID: "1c009cae-0016-4d35-9773-1e313feb5c4a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.824312 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62eefe51-d633-47bd-b7b8-1b786cc8bdde-logs" (OuterVolumeSpecName: "logs") pod "62eefe51-d633-47bd-b7b8-1b786cc8bdde" (UID: "62eefe51-d633-47bd-b7b8-1b786cc8bdde"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.826077 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1c009cae-0016-4d35-9773-1e313feb5c4a" (UID: "1c009cae-0016-4d35-9773-1e313feb5c4a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.831134 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62eefe51-d633-47bd-b7b8-1b786cc8bdde-kube-api-access-v8rlf" (OuterVolumeSpecName: "kube-api-access-v8rlf") pod "62eefe51-d633-47bd-b7b8-1b786cc8bdde" (UID: "62eefe51-d633-47bd-b7b8-1b786cc8bdde"). InnerVolumeSpecName "kube-api-access-v8rlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.835420 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-scripts" (OuterVolumeSpecName: "scripts") pod "1c009cae-0016-4d35-9773-1e313feb5c4a" (UID: "1c009cae-0016-4d35-9773-1e313feb5c4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.836865 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c009cae-0016-4d35-9773-1e313feb5c4a-kube-api-access-bknng" (OuterVolumeSpecName: "kube-api-access-bknng") pod "1c009cae-0016-4d35-9773-1e313feb5c4a" (UID: "1c009cae-0016-4d35-9773-1e313feb5c4a"). InnerVolumeSpecName "kube-api-access-bknng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.850381 4845 scope.go:117] "RemoveContainer" containerID="5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587" Feb 02 10:55:41 crc kubenswrapper[4845]: E0202 10:55:41.851901 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587\": container with ID starting with 5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587 not found: ID does not exist" containerID="5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.851942 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587"} err="failed to get container status \"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587\": rpc error: code = NotFound desc = could not find container \"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587\": container with ID starting with 5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587 not found: ID does not exist" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.851975 4845 scope.go:117] "RemoveContainer" containerID="dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34" Feb 02 10:55:41 crc kubenswrapper[4845]: E0202 10:55:41.854484 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34\": container with ID starting with dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34 not found: ID does not exist" containerID="dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.854524 
4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34"} err="failed to get container status \"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34\": rpc error: code = NotFound desc = could not find container \"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34\": container with ID starting with dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34 not found: ID does not exist" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.854555 4845 scope.go:117] "RemoveContainer" containerID="5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.856722 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587"} err="failed to get container status \"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587\": rpc error: code = NotFound desc = could not find container \"5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587\": container with ID starting with 5129ed6969a185dc174054937b32f44ecf2b35e06649ff5063fa9400e0bbb587 not found: ID does not exist" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.856767 4845 scope.go:117] "RemoveContainer" containerID="dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.879127 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-config-data" (OuterVolumeSpecName: "config-data") pod "62eefe51-d633-47bd-b7b8-1b786cc8bdde" (UID: "62eefe51-d633-47bd-b7b8-1b786cc8bdde"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.879126 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34"} err="failed to get container status \"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34\": rpc error: code = NotFound desc = could not find container \"dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34\": container with ID starting with dc81d950bd12c8d88724602378669183e198dbdb561da383063fc6ae16648d34 not found: ID does not exist" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.879194 4845 scope.go:117] "RemoveContainer" containerID="c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.889512 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "62eefe51-d633-47bd-b7b8-1b786cc8bdde" (UID: "62eefe51-d633-47bd-b7b8-1b786cc8bdde"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.891868 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1c009cae-0016-4d35-9773-1e313feb5c4a" (UID: "1c009cae-0016-4d35-9773-1e313feb5c4a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.894336 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62eefe51-d633-47bd-b7b8-1b786cc8bdde" (UID: "62eefe51-d633-47bd-b7b8-1b786cc8bdde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.922865 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.922914 4845 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.922927 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/62eefe51-d633-47bd-b7b8-1b786cc8bdde-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.922939 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.922950 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1c009cae-0016-4d35-9773-1e313feb5c4a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.922978 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.922989 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8rlf\" (UniqueName: \"kubernetes.io/projected/62eefe51-d633-47bd-b7b8-1b786cc8bdde-kube-api-access-v8rlf\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.923000 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bknng\" (UniqueName: \"kubernetes.io/projected/1c009cae-0016-4d35-9773-1e313feb5c4a-kube-api-access-bknng\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.923010 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.923020 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62eefe51-d633-47bd-b7b8-1b786cc8bdde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.939642 4845 scope.go:117] "RemoveContainer" containerID="66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.952476 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c009cae-0016-4d35-9773-1e313feb5c4a" (UID: "1c009cae-0016-4d35-9773-1e313feb5c4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.976120 4845 scope.go:117] "RemoveContainer" containerID="db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0" Feb 02 10:55:41 crc kubenswrapper[4845]: I0202 10:55:41.986039 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-config-data" (OuterVolumeSpecName: "config-data") pod "1c009cae-0016-4d35-9773-1e313feb5c4a" (UID: "1c009cae-0016-4d35-9773-1e313feb5c4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.019414 4845 scope.go:117] "RemoveContainer" containerID="59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.025413 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.025446 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c009cae-0016-4d35-9773-1e313feb5c4a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.056766 4845 scope.go:117] "RemoveContainer" containerID="c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699" Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.057299 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699\": container with ID starting with c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699 not found: ID does not exist" 
containerID="c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.057353 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699"} err="failed to get container status \"c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699\": rpc error: code = NotFound desc = could not find container \"c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699\": container with ID starting with c4da32ef9938e3a4eb7fa613a519402a486d73d59fbd5be714815e7741131699 not found: ID does not exist" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.057388 4845 scope.go:117] "RemoveContainer" containerID="66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358" Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.057643 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358\": container with ID starting with 66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358 not found: ID does not exist" containerID="66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.057676 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358"} err="failed to get container status \"66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358\": rpc error: code = NotFound desc = could not find container \"66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358\": container with ID starting with 66ceb0e8f2a60f19d2266cd291bfbd6bc612a260193a33a15be90f73d2aea358 not found: ID does not exist" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.057693 4845 scope.go:117] 
"RemoveContainer" containerID="db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0" Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.057911 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0\": container with ID starting with db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0 not found: ID does not exist" containerID="db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.057941 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0"} err="failed to get container status \"db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0\": rpc error: code = NotFound desc = could not find container \"db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0\": container with ID starting with db12f045f249fa1f64b095823c8181eff018f6d148c60c7badb2c2c6744d12c0 not found: ID does not exist" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.057958 4845 scope.go:117] "RemoveContainer" containerID="59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017" Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.058187 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017\": container with ID starting with 59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017 not found: ID does not exist" containerID="59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.058218 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017"} err="failed to get container status \"59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017\": rpc error: code = NotFound desc = could not find container \"59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017\": container with ID starting with 59afef34d3f2b508eb95da3a2f7eae6630ea046d1baa6181ce4bd5b6934bb017 not found: ID does not exist" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.077822 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.097650 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.125576 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r4lj7" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.172281 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.173919 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="ceilometer-central-agent" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.173935 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="ceilometer-central-agent" Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.173958 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b2bad3a-8153-41d8-83f6-9f9caa16589b" containerName="nova-cell1-conductor-db-sync" Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.173965 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b2bad3a-8153-41d8-83f6-9f9caa16589b" containerName="nova-cell1-conductor-db-sync" Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 
10:55:42.173983 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="sg-core"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.173989 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="sg-core"
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.174018 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" containerName="dnsmasq-dns"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.174024 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" containerName="dnsmasq-dns"
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.174038 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7439e987-75e8-4cc8-840a-742c6f07dea9" containerName="nova-manage"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.174044 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7439e987-75e8-4cc8-840a-742c6f07dea9" containerName="nova-manage"
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.174058 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerName="nova-metadata-log"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.174064 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerName="nova-metadata-log"
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.174081 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="proxy-httpd"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.174086 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="proxy-httpd"
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.174103 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" containerName="init"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.174108 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" containerName="init"
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.174127 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="ceilometer-notification-agent"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.174133 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="ceilometer-notification-agent"
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.174158 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerName="nova-metadata-metadata"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.174163 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerName="nova-metadata-metadata"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176596 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="sg-core"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176640 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab561fd-1cd4-43c4-a09d-401ca966b4bb" containerName="dnsmasq-dns"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176663 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerName="nova-metadata-metadata"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176677 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="ceilometer-notification-agent"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176697 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7439e987-75e8-4cc8-840a-742c6f07dea9" containerName="nova-manage"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176720 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" containerName="nova-metadata-log"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176741 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="ceilometer-central-agent"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176759 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b2bad3a-8153-41d8-83f6-9f9caa16589b" containerName="nova-cell1-conductor-db-sync"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.176780 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" containerName="proxy-httpd"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.178787 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.180789 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.182060 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.190948 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.203273 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.222347 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.237240 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-config-data\") pod \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") "
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.237484 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7jh8\" (UniqueName: \"kubernetes.io/projected/7b2bad3a-8153-41d8-83f6-9f9caa16589b-kube-api-access-l7jh8\") pod \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") "
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.237536 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-scripts\") pod \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") "
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.237635 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-combined-ca-bundle\") pod \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\" (UID: \"7b2bad3a-8153-41d8-83f6-9f9caa16589b\") "
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.241735 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b2bad3a-8153-41d8-83f6-9f9caa16589b-kube-api-access-l7jh8" (OuterVolumeSpecName: "kube-api-access-l7jh8") pod "7b2bad3a-8153-41d8-83f6-9f9caa16589b" (UID: "7b2bad3a-8153-41d8-83f6-9f9caa16589b"). InnerVolumeSpecName "kube-api-access-l7jh8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.242614 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-scripts" (OuterVolumeSpecName: "scripts") pod "7b2bad3a-8153-41d8-83f6-9f9caa16589b" (UID: "7b2bad3a-8153-41d8-83f6-9f9caa16589b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.245196 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.248099 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.253826 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.255589 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.259596 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.262094 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.285447 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-config-data" (OuterVolumeSpecName: "config-data") pod "7b2bad3a-8153-41d8-83f6-9f9caa16589b" (UID: "7b2bad3a-8153-41d8-83f6-9f9caa16589b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.289076 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b2bad3a-8153-41d8-83f6-9f9caa16589b" (UID: "7b2bad3a-8153-41d8-83f6-9f9caa16589b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.340155 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.340263 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-config-data\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.340548 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-log-httpd\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.340611 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.340647 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.340760 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-scripts\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.340954 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab48fc91-e9f1-4362-8cb8-091846601a7e-logs\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341124 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlh7l\" (UniqueName: \"kubernetes.io/projected/ab48fc91-e9f1-4362-8cb8-091846601a7e-kube-api-access-zlh7l\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341229 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-config-data\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341330 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341380 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-run-httpd\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341461 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341510 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44xlk\" (UniqueName: \"kubernetes.io/projected/c8162bea-daa1-42e3-8921-3c12ad56dfa6-kube-api-access-44xlk\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341633 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7jh8\" (UniqueName: \"kubernetes.io/projected/7b2bad3a-8153-41d8-83f6-9f9caa16589b-kube-api-access-l7jh8\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341649 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341662 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.341675 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b2bad3a-8153-41d8-83f6-9f9caa16589b-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.443125 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.443197 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44xlk\" (UniqueName: \"kubernetes.io/projected/c8162bea-daa1-42e3-8921-3c12ad56dfa6-kube-api-access-44xlk\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.443253 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.443311 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-config-data\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.443474 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-log-httpd\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.443509 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.443538 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.443588 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-scripts\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.444513 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-log-httpd\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.444720 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab48fc91-e9f1-4362-8cb8-091846601a7e-logs\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.444858 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlh7l\" (UniqueName: \"kubernetes.io/projected/ab48fc91-e9f1-4362-8cb8-091846601a7e-kube-api-access-zlh7l\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.444964 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-config-data\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.445047 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.445090 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-run-httpd\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.445409 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab48fc91-e9f1-4362-8cb8-091846601a7e-logs\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.445580 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-run-httpd\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.448470 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.448634 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-scripts\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.449511 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-config-data\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.449971 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.451685 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.452429 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-config-data\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.452460 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.455657 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.464217 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44xlk\" (UniqueName: \"kubernetes.io/projected/c8162bea-daa1-42e3-8921-3c12ad56dfa6-kube-api-access-44xlk\") pod \"ceilometer-0\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.470423 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlh7l\" (UniqueName: \"kubernetes.io/projected/ab48fc91-e9f1-4362-8cb8-091846601a7e-kube-api-access-zlh7l\") pod \"nova-metadata-0\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.507574 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.570866 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.780070 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-r4lj7" event={"ID":"7b2bad3a-8153-41d8-83f6-9f9caa16589b","Type":"ContainerDied","Data":"da194489804af83cf7bd974e17436580f313625e8bbc411b3359df8752a517c2"}
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.780288 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da194489804af83cf7bd974e17436580f313625e8bbc411b3359df8752a517c2"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.780194 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-r4lj7"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.795066 4845 generic.go:334] "Generic (PLEG): container finished" podID="516e4c98-314a-4116-b0fc-45c18fd1c7e1" containerID="86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22" exitCode=0
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.795196 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"516e4c98-314a-4116-b0fc-45c18fd1c7e1","Type":"ContainerDied","Data":"86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22"}
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.860384 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.862111 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.865084 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.870359 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.956544 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22 is running failed: container process not found" containerID="86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.956967 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22 is running failed: container process not found" containerID="86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.957297 4845 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22 is running failed: container process not found" containerID="86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 02 10:55:42 crc kubenswrapper[4845]: E0202 10:55:42.957331 4845 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="516e4c98-314a-4116-b0fc-45c18fd1c7e1" containerName="nova-scheduler-scheduler"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.958071 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8nxz\" (UniqueName: \"kubernetes.io/projected/039d1d72-0f72-4172-a037-ea289c8d7fbb-kube-api-access-p8nxz\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.958194 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039d1d72-0f72-4172-a037-ea289c8d7fbb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:42 crc kubenswrapper[4845]: I0202 10:55:42.958242 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039d1d72-0f72-4172-a037-ea289c8d7fbb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.060258 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039d1d72-0f72-4172-a037-ea289c8d7fbb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.060310 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039d1d72-0f72-4172-a037-ea289c8d7fbb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.060546 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8nxz\" (UniqueName: \"kubernetes.io/projected/039d1d72-0f72-4172-a037-ea289c8d7fbb-kube-api-access-p8nxz\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.066250 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039d1d72-0f72-4172-a037-ea289c8d7fbb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.066729 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039d1d72-0f72-4172-a037-ea289c8d7fbb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.082530 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8nxz\" (UniqueName: \"kubernetes.io/projected/039d1d72-0f72-4172-a037-ea289c8d7fbb-kube-api-access-p8nxz\") pod \"nova-cell1-conductor-0\" (UID: \"039d1d72-0f72-4172-a037-ea289c8d7fbb\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.135417 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 10:55:43 crc kubenswrapper[4845]: W0202 10:55:43.140980 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab48fc91_e9f1_4362_8cb8_091846601a7e.slice/crio-03916b40dcfb1d58e9a1856cf115e52af0968b63f0ef6c849ed5ef4cac5ddc0d WatchSource:0}: Error finding container 03916b40dcfb1d58e9a1856cf115e52af0968b63f0ef6c849ed5ef4cac5ddc0d: Status 404 returned error can't find the container with id 03916b40dcfb1d58e9a1856cf115e52af0968b63f0ef6c849ed5ef4cac5ddc0d
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.191393 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.245230 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:55:43 crc kubenswrapper[4845]: W0202 10:55:43.249729 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8162bea_daa1_42e3_8921_3c12ad56dfa6.slice/crio-2275bc3cbd2ca4cb0365fb35199c6ba0dd53592c51fe579e48d78b478ab3a151 WatchSource:0}: Error finding container 2275bc3cbd2ca4cb0365fb35199c6ba0dd53592c51fe579e48d78b478ab3a151: Status 404 returned error can't find the container with id 2275bc3cbd2ca4cb0365fb35199c6ba0dd53592c51fe579e48d78b478ab3a151
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.258432 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.367851 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-combined-ca-bundle\") pod \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") "
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.368410 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gchv8\" (UniqueName: \"kubernetes.io/projected/516e4c98-314a-4116-b0fc-45c18fd1c7e1-kube-api-access-gchv8\") pod \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") "
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.368866 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-config-data\") pod \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\" (UID: \"516e4c98-314a-4116-b0fc-45c18fd1c7e1\") "
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.377072 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/516e4c98-314a-4116-b0fc-45c18fd1c7e1-kube-api-access-gchv8" (OuterVolumeSpecName: "kube-api-access-gchv8") pod "516e4c98-314a-4116-b0fc-45c18fd1c7e1" (UID: "516e4c98-314a-4116-b0fc-45c18fd1c7e1"). InnerVolumeSpecName "kube-api-access-gchv8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.406813 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "516e4c98-314a-4116-b0fc-45c18fd1c7e1" (UID: "516e4c98-314a-4116-b0fc-45c18fd1c7e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.422449 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-config-data" (OuterVolumeSpecName: "config-data") pod "516e4c98-314a-4116-b0fc-45c18fd1c7e1" (UID: "516e4c98-314a-4116-b0fc-45c18fd1c7e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.473010 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gchv8\" (UniqueName: \"kubernetes.io/projected/516e4c98-314a-4116-b0fc-45c18fd1c7e1-kube-api-access-gchv8\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.473214 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.473310 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516e4c98-314a-4116-b0fc-45c18fd1c7e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.744865 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c009cae-0016-4d35-9773-1e313feb5c4a" path="/var/lib/kubelet/pods/1c009cae-0016-4d35-9773-1e313feb5c4a/volumes"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.747156 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62eefe51-d633-47bd-b7b8-1b786cc8bdde" path="/var/lib/kubelet/pods/62eefe51-d633-47bd-b7b8-1b786cc8bdde/volumes"
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.748238 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 
10:55:43.819626 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerStarted","Data":"2275bc3cbd2ca4cb0365fb35199c6ba0dd53592c51fe579e48d78b478ab3a151"} Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.821987 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.821986 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"516e4c98-314a-4116-b0fc-45c18fd1c7e1","Type":"ContainerDied","Data":"ad10cf81d533e8b37ef72f4ca4f01fc9f32600963e1feb8c42617710f205cfdd"} Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.822140 4845 scope.go:117] "RemoveContainer" containerID="86703aa66625a53c7ebcc31b1564a8020231b6947bd8876cc1af251a9048bf22" Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.825183 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab48fc91-e9f1-4362-8cb8-091846601a7e","Type":"ContainerStarted","Data":"1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d"} Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.825231 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab48fc91-e9f1-4362-8cb8-091846601a7e","Type":"ContainerStarted","Data":"d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79"} Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.825247 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab48fc91-e9f1-4362-8cb8-091846601a7e","Type":"ContainerStarted","Data":"03916b40dcfb1d58e9a1856cf115e52af0968b63f0ef6c849ed5ef4cac5ddc0d"} Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.827714 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"039d1d72-0f72-4172-a037-ea289c8d7fbb","Type":"ContainerStarted","Data":"82ad75b278c8ed413545045dd321895af2d2e947bd739995535cc4a4f30a4b1d"} Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.866720 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.866667914 podStartE2EDuration="1.866667914s" podCreationTimestamp="2026-02-02 10:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:43.844053143 +0000 UTC m=+1424.935454593" watchObservedRunningTime="2026-02-02 10:55:43.866667914 +0000 UTC m=+1424.958069364" Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.938666 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:43 crc kubenswrapper[4845]: I0202 10:55:43.981964 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.003566 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:44 crc kubenswrapper[4845]: E0202 10:55:44.004254 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="516e4c98-314a-4116-b0fc-45c18fd1c7e1" containerName="nova-scheduler-scheduler" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.004279 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="516e4c98-314a-4116-b0fc-45c18fd1c7e1" containerName="nova-scheduler-scheduler" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.004679 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="516e4c98-314a-4116-b0fc-45c18fd1c7e1" containerName="nova-scheduler-scheduler" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.006072 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.011954 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.017630 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.105519 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-config-data\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.105976 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnk77\" (UniqueName: \"kubernetes.io/projected/652d8576-912d-4384-b487-aa6b987b567f-kube-api-access-vnk77\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.106077 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.209249 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-config-data\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.209313 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vnk77\" (UniqueName: \"kubernetes.io/projected/652d8576-912d-4384-b487-aa6b987b567f-kube-api-access-vnk77\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.209379 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.213910 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.217483 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-config-data\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.233914 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnk77\" (UniqueName: \"kubernetes.io/projected/652d8576-912d-4384-b487-aa6b987b567f-kube-api-access-vnk77\") pod \"nova-scheduler-0\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.331498 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.839157 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"039d1d72-0f72-4172-a037-ea289c8d7fbb","Type":"ContainerStarted","Data":"6a3f53c8d4d64b0fbda22be0ce7ca0c2081aa58ea492ba52ffebc44dcbc5dae9"} Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.841234 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.845724 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerStarted","Data":"2eb041a66024bd6fdf0a4047a860f057456d28471e9ac67391405774838038c1"} Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.872072 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.872028746 podStartE2EDuration="2.872028746s" podCreationTimestamp="2026-02-02 10:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:44.858751549 +0000 UTC m=+1425.950152999" watchObservedRunningTime="2026-02-02 10:55:44.872028746 +0000 UTC m=+1425.963430196" Feb 02 10:55:44 crc kubenswrapper[4845]: I0202 10:55:44.936032 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.731043 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="516e4c98-314a-4116-b0fc-45c18fd1c7e1" path="/var/lib/kubelet/pods/516e4c98-314a-4116-b0fc-45c18fd1c7e1/volumes" Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.861927 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerStarted","Data":"4e6899cd823b95ed877f86a8c1dd20bcde7abe8ab4e5dac19c5ab5f2085fe814"} Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.861983 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerStarted","Data":"e82a8977aa57e2cfa579b4996bfd47a417b1401100d516e3057e129ade68d451"} Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.865121 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"652d8576-912d-4384-b487-aa6b987b567f","Type":"ContainerStarted","Data":"444f6ac6315562315a504c0600f21ab36cd45bf454f4002a51c1be86e2c6f5bb"} Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.865186 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"652d8576-912d-4384-b487-aa6b987b567f","Type":"ContainerStarted","Data":"9d162ecc4fb3460600a969e51ca75c36688e2c03aaec73046e664f22f56be6a6"} Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.870212 4845 generic.go:334] "Generic (PLEG): container finished" podID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerID="7371036457c908a60d98a53f34d54f0d70618efbebcf97fbc3e3f1b041ae7110" exitCode=0 Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.871556 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1498a0e1-1035-4eba-bbc5-169cd1de86a0","Type":"ContainerDied","Data":"7371036457c908a60d98a53f34d54f0d70618efbebcf97fbc3e3f1b041ae7110"} Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.871827 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1498a0e1-1035-4eba-bbc5-169cd1de86a0","Type":"ContainerDied","Data":"f57f9e17a253f3027639c6870894363d5912e86c0ad65390204a822d4d8ff33a"} Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.872142 4845 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="f57f9e17a253f3027639c6870894363d5912e86c0ad65390204a822d4d8ff33a" Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.881541 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.888917 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.888874904 podStartE2EDuration="2.888874904s" podCreationTimestamp="2026-02-02 10:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:45.884251219 +0000 UTC m=+1426.975652669" watchObservedRunningTime="2026-02-02 10:55:45.888874904 +0000 UTC m=+1426.980276354" Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.952154 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1498a0e1-1035-4eba-bbc5-169cd1de86a0-logs\") pod \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.952289 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-config-data\") pod \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.952308 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-combined-ca-bundle\") pod \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.952413 4845 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-zgljn\" (UniqueName: \"kubernetes.io/projected/1498a0e1-1035-4eba-bbc5-169cd1de86a0-kube-api-access-zgljn\") pod \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\" (UID: \"1498a0e1-1035-4eba-bbc5-169cd1de86a0\") " Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.953181 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1498a0e1-1035-4eba-bbc5-169cd1de86a0-logs" (OuterVolumeSpecName: "logs") pod "1498a0e1-1035-4eba-bbc5-169cd1de86a0" (UID: "1498a0e1-1035-4eba-bbc5-169cd1de86a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:45 crc kubenswrapper[4845]: I0202 10:55:45.976914 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1498a0e1-1035-4eba-bbc5-169cd1de86a0-kube-api-access-zgljn" (OuterVolumeSpecName: "kube-api-access-zgljn") pod "1498a0e1-1035-4eba-bbc5-169cd1de86a0" (UID: "1498a0e1-1035-4eba-bbc5-169cd1de86a0"). InnerVolumeSpecName "kube-api-access-zgljn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:45.999777 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1498a0e1-1035-4eba-bbc5-169cd1de86a0" (UID: "1498a0e1-1035-4eba-bbc5-169cd1de86a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.006066 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-config-data" (OuterVolumeSpecName: "config-data") pod "1498a0e1-1035-4eba-bbc5-169cd1de86a0" (UID: "1498a0e1-1035-4eba-bbc5-169cd1de86a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.055579 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgljn\" (UniqueName: \"kubernetes.io/projected/1498a0e1-1035-4eba-bbc5-169cd1de86a0-kube-api-access-zgljn\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.055621 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1498a0e1-1035-4eba-bbc5-169cd1de86a0-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.055635 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.055647 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1498a0e1-1035-4eba-bbc5-169cd1de86a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.882008 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.925135 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.938735 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.962677 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:46 crc kubenswrapper[4845]: E0202 10:55:46.963362 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-api" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.963384 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-api" Feb 02 10:55:46 crc kubenswrapper[4845]: E0202 10:55:46.963399 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-log" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.963407 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-log" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.963650 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-log" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.963709 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" containerName="nova-api-api" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.965128 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.968640 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:55:46 crc kubenswrapper[4845]: I0202 10:55:46.988948 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.083380 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-config-data\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.083732 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98z8\" (UniqueName: \"kubernetes.io/projected/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-kube-api-access-m98z8\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.083933 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-logs\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.084002 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.125476 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/kube-state-metrics-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.185990 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-logs\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.186108 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.186247 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-config-data\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.186312 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m98z8\" (UniqueName: \"kubernetes.io/projected/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-kube-api-access-m98z8\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.191927 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.192051 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-logs\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.215525 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98z8\" (UniqueName: \"kubernetes.io/projected/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-kube-api-access-m98z8\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.235173 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-config-data\") pod \"nova-api-0\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.297192 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.509829 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.511796 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.727580 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1498a0e1-1035-4eba-bbc5-169cd1de86a0" path="/var/lib/kubelet/pods/1498a0e1-1035-4eba-bbc5-169cd1de86a0/volumes" Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.799764 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.896443 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55","Type":"ContainerStarted","Data":"07a8bb175c6ab486cf8171570281ab764339f6cc80042312c0c66b5beebb4313"} Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.902241 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerStarted","Data":"6656f047e77f8e3107d54021260879721254382d799001121bd164ebae15d9c9"} Feb 02 10:55:47 crc kubenswrapper[4845]: I0202 10:55:47.902308 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:55:48 crc kubenswrapper[4845]: I0202 10:55:48.915408 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55","Type":"ContainerStarted","Data":"0bac2a8d77845121bf798ed7e7bbbcdf0da36dc5492619b8084fe476f2575739"} Feb 02 10:55:48 crc kubenswrapper[4845]: I0202 10:55:48.916041 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55","Type":"ContainerStarted","Data":"b18cfab9ee8f5f4276f99dca37c12a86dbf672950e6b072ff69b9e4ffc29f67d"} Feb 02 10:55:48 crc kubenswrapper[4845]: I0202 10:55:48.941853 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.950177499 podStartE2EDuration="6.941829568s" podCreationTimestamp="2026-02-02 10:55:42 +0000 UTC" firstStartedPulling="2026-02-02 10:55:43.2525119 +0000 UTC m=+1424.343913360" lastFinishedPulling="2026-02-02 10:55:47.244163979 +0000 UTC m=+1428.335565429" observedRunningTime="2026-02-02 10:55:47.926396931 +0000 UTC m=+1429.017798391" watchObservedRunningTime="2026-02-02 10:55:48.941829568 +0000 UTC m=+1430.033231018" Feb 02 10:55:48 crc kubenswrapper[4845]: I0202 10:55:48.945305 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.945287639 podStartE2EDuration="2.945287639s" podCreationTimestamp="2026-02-02 10:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:55:48.932737403 +0000 UTC m=+1430.024138853" watchObservedRunningTime="2026-02-02 10:55:48.945287639 +0000 UTC m=+1430.036689099" Feb 02 10:55:49 crc kubenswrapper[4845]: I0202 10:55:49.332328 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 10:55:52 crc kubenswrapper[4845]: I0202 10:55:52.507923 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:55:52 crc kubenswrapper[4845]: I0202 10:55:52.510081 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:55:53 crc kubenswrapper[4845]: I0202 10:55:53.224370 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 02 10:55:53 crc kubenswrapper[4845]: I0202 10:55:53.520071 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:55:53 crc kubenswrapper[4845]: I0202 10:55:53.520082 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:55:54 crc kubenswrapper[4845]: I0202 10:55:54.332664 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 
02 10:55:54 crc kubenswrapper[4845]: I0202 10:55:54.366276 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 10:55:55 crc kubenswrapper[4845]: I0202 10:55:55.026541 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 10:55:57 crc kubenswrapper[4845]: I0202 10:55:57.298052 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:55:57 crc kubenswrapper[4845]: I0202 10:55:57.298569 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:55:58 crc kubenswrapper[4845]: I0202 10:55:58.381278 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:55:58 crc kubenswrapper[4845]: I0202 10:55:58.381295 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:56:02 crc kubenswrapper[4845]: I0202 10:56:02.513365 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 10:56:02 crc kubenswrapper[4845]: I0202 10:56:02.514181 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 10:56:02 crc kubenswrapper[4845]: I0202 10:56:02.521557 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 10:56:02 crc kubenswrapper[4845]: I0202 10:56:02.521751 4845 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 10:56:04 crc kubenswrapper[4845]: I0202 10:56:04.911011 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.062037 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7gzv\" (UniqueName: \"kubernetes.io/projected/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-kube-api-access-p7gzv\") pod \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.062381 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-combined-ca-bundle\") pod \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.062792 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-config-data\") pod \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\" (UID: \"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98\") " Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.071585 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-kube-api-access-p7gzv" (OuterVolumeSpecName: "kube-api-access-p7gzv") pod "6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" (UID: "6b1dccba-b1ef-4b7c-aa80-ec15529e7a98"). InnerVolumeSpecName "kube-api-access-p7gzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.119098 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-config-data" (OuterVolumeSpecName: "config-data") pod "6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" (UID: "6b1dccba-b1ef-4b7c-aa80-ec15529e7a98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.166749 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7gzv\" (UniqueName: \"kubernetes.io/projected/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-kube-api-access-p7gzv\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.166795 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.167383 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" (UID: "6b1dccba-b1ef-4b7c-aa80-ec15529e7a98"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.203131 4845 generic.go:334] "Generic (PLEG): container finished" podID="6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" containerID="f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad" exitCode=137 Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.203191 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98","Type":"ContainerDied","Data":"f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad"} Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.203226 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b1dccba-b1ef-4b7c-aa80-ec15529e7a98","Type":"ContainerDied","Data":"fe04f70476c707b2543738532ef74000620c8c2924f735fde10bff5d95053cb3"} Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.203249 4845 scope.go:117] "RemoveContainer" containerID="f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.203451 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.268948 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.269169 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.276125 4845 scope.go:117] "RemoveContainer" containerID="f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad" Feb 02 10:56:05 crc kubenswrapper[4845]: E0202 10:56:05.278296 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad\": container with ID starting with f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad not found: ID does not exist" containerID="f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.278331 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad"} err="failed to get container status \"f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad\": rpc error: code = NotFound desc = could not find container \"f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad\": container with ID starting with f2b8976e3e955a3582976396490af97d79417dc4788630c356c0e93e5f11bcad not found: ID does not exist" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.289476 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.318974 4845 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:56:05 crc kubenswrapper[4845]: E0202 10:56:05.319666 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.319696 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.320089 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.321216 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.330293 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.330358 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.330311 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.333169 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.474932 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72h9m\" (UniqueName: \"kubernetes.io/projected/85bf6fdc-0816-4f80-966c-426f4906c581-kube-api-access-72h9m\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" 
Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.475121 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.475154 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.475287 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.475377 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.579933 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 
crc kubenswrapper[4845]: I0202 10:56:05.580567 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.580710 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.580835 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.581000 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72h9m\" (UniqueName: \"kubernetes.io/projected/85bf6fdc-0816-4f80-966c-426f4906c581-kube-api-access-72h9m\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.586237 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.586756 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.588215 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.589815 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/85bf6fdc-0816-4f80-966c-426f4906c581-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.601584 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72h9m\" (UniqueName: \"kubernetes.io/projected/85bf6fdc-0816-4f80-966c-426f4906c581-kube-api-access-72h9m\") pod \"nova-cell1-novncproxy-0\" (UID: \"85bf6fdc-0816-4f80-966c-426f4906c581\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.651211 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:05 crc kubenswrapper[4845]: I0202 10:56:05.729695 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1dccba-b1ef-4b7c-aa80-ec15529e7a98" path="/var/lib/kubelet/pods/6b1dccba-b1ef-4b7c-aa80-ec15529e7a98/volumes" Feb 02 10:56:06 crc kubenswrapper[4845]: I0202 10:56:06.143241 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:56:06 crc kubenswrapper[4845]: I0202 10:56:06.219757 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"85bf6fdc-0816-4f80-966c-426f4906c581","Type":"ContainerStarted","Data":"57f9670fd6dc1ac7c811dd9e262312cf03489eba76b3d88e6deb5fa9fa5ebd8a"} Feb 02 10:56:07 crc kubenswrapper[4845]: I0202 10:56:07.233453 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"85bf6fdc-0816-4f80-966c-426f4906c581","Type":"ContainerStarted","Data":"01a13e096495cac9c42990a9d83ca89789cbc700b5b4b2fc0018bb90dc2ecccb"} Feb 02 10:56:07 crc kubenswrapper[4845]: I0202 10:56:07.260376 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.260349724 podStartE2EDuration="2.260349724s" podCreationTimestamp="2026-02-02 10:56:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:07.251333891 +0000 UTC m=+1448.342735341" watchObservedRunningTime="2026-02-02 10:56:07.260349724 +0000 UTC m=+1448.351751174" Feb 02 10:56:07 crc kubenswrapper[4845]: I0202 10:56:07.302138 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 10:56:07 crc kubenswrapper[4845]: I0202 10:56:07.302681 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 10:56:07 
crc kubenswrapper[4845]: I0202 10:56:07.302790 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 10:56:07 crc kubenswrapper[4845]: I0202 10:56:07.304695 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.257823 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.271818 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.441916 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-785dh"] Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.444299 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.465249 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-785dh"] Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.564833 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4bsf\" (UniqueName: \"kubernetes.io/projected/330a4322-2c1c-4f9a-9093-bfae422cc1fb-kube-api-access-s4bsf\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.564906 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc 
kubenswrapper[4845]: I0202 10:56:08.564953 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-config\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.565024 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.565066 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.565089 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.667256 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 
02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.667306 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.667417 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4bsf\" (UniqueName: \"kubernetes.io/projected/330a4322-2c1c-4f9a-9093-bfae422cc1fb-kube-api-access-s4bsf\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.667437 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.667486 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-config\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.667556 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.668311 
4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.668329 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.668417 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.668699 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.668876 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a4322-2c1c-4f9a-9093-bfae422cc1fb-config\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.712154 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4bsf\" (UniqueName: 
\"kubernetes.io/projected/330a4322-2c1c-4f9a-9093-bfae422cc1fb-kube-api-access-s4bsf\") pod \"dnsmasq-dns-6b7bbf7cf9-785dh\" (UID: \"330a4322-2c1c-4f9a-9093-bfae422cc1fb\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:08 crc kubenswrapper[4845]: I0202 10:56:08.790582 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:09 crc kubenswrapper[4845]: I0202 10:56:09.421384 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-785dh"] Feb 02 10:56:10 crc kubenswrapper[4845]: I0202 10:56:10.294039 4845 generic.go:334] "Generic (PLEG): container finished" podID="330a4322-2c1c-4f9a-9093-bfae422cc1fb" containerID="a24ae3b07619062acab47128eaac209a8d18956597107a5ffa53abad39c595ee" exitCode=0 Feb 02 10:56:10 crc kubenswrapper[4845]: I0202 10:56:10.294100 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" event={"ID":"330a4322-2c1c-4f9a-9093-bfae422cc1fb","Type":"ContainerDied","Data":"a24ae3b07619062acab47128eaac209a8d18956597107a5ffa53abad39c595ee"} Feb 02 10:56:10 crc kubenswrapper[4845]: I0202 10:56:10.294505 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" event={"ID":"330a4322-2c1c-4f9a-9093-bfae422cc1fb","Type":"ContainerStarted","Data":"60f36e3b096718034ca98b3ef58919759a9c1e7b337d094bb4ca2e6a5d3d5de0"} Feb 02 10:56:10 crc kubenswrapper[4845]: I0202 10:56:10.653288 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.063130 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.064012 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" 
containerName="ceilometer-central-agent" containerID="cri-o://2eb041a66024bd6fdf0a4047a860f057456d28471e9ac67391405774838038c1" gracePeriod=30 Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.064156 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="proxy-httpd" containerID="cri-o://6656f047e77f8e3107d54021260879721254382d799001121bd164ebae15d9c9" gracePeriod=30 Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.064221 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="sg-core" containerID="cri-o://4e6899cd823b95ed877f86a8c1dd20bcde7abe8ab4e5dac19c5ab5f2085fe814" gracePeriod=30 Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.064273 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="ceilometer-notification-agent" containerID="cri-o://e82a8977aa57e2cfa579b4996bfd47a417b1401100d516e3057e129ade68d451" gracePeriod=30 Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.090017 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.260529 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.328011 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" event={"ID":"330a4322-2c1c-4f9a-9093-bfae422cc1fb","Type":"ContainerStarted","Data":"f80ccc3b5c32b7ad0f9e13715694bc96f5f8cedbeb2ed5bd60dac4af17084777"} Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 
10:56:11.328094 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.340032 4845 generic.go:334] "Generic (PLEG): container finished" podID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerID="6656f047e77f8e3107d54021260879721254382d799001121bd164ebae15d9c9" exitCode=0 Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.340070 4845 generic.go:334] "Generic (PLEG): container finished" podID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerID="4e6899cd823b95ed877f86a8c1dd20bcde7abe8ab4e5dac19c5ab5f2085fe814" exitCode=2 Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.340130 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerDied","Data":"6656f047e77f8e3107d54021260879721254382d799001121bd164ebae15d9c9"} Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.340189 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerDied","Data":"4e6899cd823b95ed877f86a8c1dd20bcde7abe8ab4e5dac19c5ab5f2085fe814"} Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.340275 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-log" containerID="cri-o://b18cfab9ee8f5f4276f99dca37c12a86dbf672950e6b072ff69b9e4ffc29f67d" gracePeriod=30 Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.340392 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-api" containerID="cri-o://0bac2a8d77845121bf798ed7e7bbbcdf0da36dc5492619b8084fe476f2575739" gracePeriod=30 Feb 02 10:56:11 crc kubenswrapper[4845]: I0202 10:56:11.395667 4845 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" podStartSLOduration=3.395647236 podStartE2EDuration="3.395647236s" podCreationTimestamp="2026-02-02 10:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:11.373109899 +0000 UTC m=+1452.464511349" watchObservedRunningTime="2026-02-02 10:56:11.395647236 +0000 UTC m=+1452.487048686" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.363822 4845 generic.go:334] "Generic (PLEG): container finished" podID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerID="e82a8977aa57e2cfa579b4996bfd47a417b1401100d516e3057e129ade68d451" exitCode=0 Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.363856 4845 generic.go:334] "Generic (PLEG): container finished" podID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerID="2eb041a66024bd6fdf0a4047a860f057456d28471e9ac67391405774838038c1" exitCode=0 Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.363931 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerDied","Data":"e82a8977aa57e2cfa579b4996bfd47a417b1401100d516e3057e129ade68d451"} Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.363963 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerDied","Data":"2eb041a66024bd6fdf0a4047a860f057456d28471e9ac67391405774838038c1"} Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.371011 4845 generic.go:334] "Generic (PLEG): container finished" podID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerID="b18cfab9ee8f5f4276f99dca37c12a86dbf672950e6b072ff69b9e4ffc29f67d" exitCode=143 Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.372441 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55","Type":"ContainerDied","Data":"b18cfab9ee8f5f4276f99dca37c12a86dbf672950e6b072ff69b9e4ffc29f67d"} Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.620584 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.661917 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-scripts\") pod \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.661981 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44xlk\" (UniqueName: \"kubernetes.io/projected/c8162bea-daa1-42e3-8921-3c12ad56dfa6-kube-api-access-44xlk\") pod \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.662021 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-sg-core-conf-yaml\") pod \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.662114 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-log-httpd\") pod \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.662464 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-run-httpd\") pod \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.662493 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-ceilometer-tls-certs\") pod \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.662555 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-combined-ca-bundle\") pod \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.662589 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-config-data\") pod \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\" (UID: \"c8162bea-daa1-42e3-8921-3c12ad56dfa6\") " Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.663327 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8162bea-daa1-42e3-8921-3c12ad56dfa6" (UID: "c8162bea-daa1-42e3-8921-3c12ad56dfa6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.663675 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.663734 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8162bea-daa1-42e3-8921-3c12ad56dfa6" (UID: "c8162bea-daa1-42e3-8921-3c12ad56dfa6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.670052 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8162bea-daa1-42e3-8921-3c12ad56dfa6-kube-api-access-44xlk" (OuterVolumeSpecName: "kube-api-access-44xlk") pod "c8162bea-daa1-42e3-8921-3c12ad56dfa6" (UID: "c8162bea-daa1-42e3-8921-3c12ad56dfa6"). InnerVolumeSpecName "kube-api-access-44xlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.703836 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-scripts" (OuterVolumeSpecName: "scripts") pod "c8162bea-daa1-42e3-8921-3c12ad56dfa6" (UID: "c8162bea-daa1-42e3-8921-3c12ad56dfa6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.773817 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8162bea-daa1-42e3-8921-3c12ad56dfa6-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.773858 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.773867 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44xlk\" (UniqueName: \"kubernetes.io/projected/c8162bea-daa1-42e3-8921-3c12ad56dfa6-kube-api-access-44xlk\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.792164 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8162bea-daa1-42e3-8921-3c12ad56dfa6" (UID: "c8162bea-daa1-42e3-8921-3c12ad56dfa6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.840055 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c8162bea-daa1-42e3-8921-3c12ad56dfa6" (UID: "c8162bea-daa1-42e3-8921-3c12ad56dfa6"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.876711 4845 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.876991 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.877995 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8162bea-daa1-42e3-8921-3c12ad56dfa6" (UID: "c8162bea-daa1-42e3-8921-3c12ad56dfa6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:12 crc kubenswrapper[4845]: I0202 10:56:12.980314 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.013700 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-config-data" (OuterVolumeSpecName: "config-data") pod "c8162bea-daa1-42e3-8921-3c12ad56dfa6" (UID: "c8162bea-daa1-42e3-8921-3c12ad56dfa6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.084645 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8162bea-daa1-42e3-8921-3c12ad56dfa6-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.387604 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8162bea-daa1-42e3-8921-3c12ad56dfa6","Type":"ContainerDied","Data":"2275bc3cbd2ca4cb0365fb35199c6ba0dd53592c51fe579e48d78b478ab3a151"} Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.387657 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.387672 4845 scope.go:117] "RemoveContainer" containerID="6656f047e77f8e3107d54021260879721254382d799001121bd164ebae15d9c9" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.419685 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.425736 4845 scope.go:117] "RemoveContainer" containerID="4e6899cd823b95ed877f86a8c1dd20bcde7abe8ab4e5dac19c5ab5f2085fe814" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.431385 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.456969 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:13 crc kubenswrapper[4845]: E0202 10:56:13.458459 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="proxy-httpd" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.458490 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="proxy-httpd" Feb 02 10:56:13 crc 
kubenswrapper[4845]: E0202 10:56:13.458511 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="sg-core" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.458520 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="sg-core" Feb 02 10:56:13 crc kubenswrapper[4845]: E0202 10:56:13.458594 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="ceilometer-central-agent" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.458608 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="ceilometer-central-agent" Feb 02 10:56:13 crc kubenswrapper[4845]: E0202 10:56:13.458623 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="ceilometer-notification-agent" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.458637 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="ceilometer-notification-agent" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.462416 4845 scope.go:117] "RemoveContainer" containerID="e82a8977aa57e2cfa579b4996bfd47a417b1401100d516e3057e129ade68d451" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.463345 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="proxy-httpd" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.463415 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="sg-core" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.463454 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="ceilometer-central-agent" Feb 02 
10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.463473 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="ceilometer-notification-agent" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.477808 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.481769 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.482011 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.481905 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.513413 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.565934 4845 scope.go:117] "RemoveContainer" containerID="2eb041a66024bd6fdf0a4047a860f057456d28471e9ac67391405774838038c1" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.621347 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrkdz\" (UniqueName: \"kubernetes.io/projected/41f8f678-af44-4ddc-b2db-01f96bae8601-kube-api-access-rrkdz\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.621422 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-run-httpd\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc 
kubenswrapper[4845]: I0202 10:56:13.621465 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-config-data\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.621489 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.621505 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.621544 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-log-httpd\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.621577 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-scripts\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.621595 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.723219 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrkdz\" (UniqueName: \"kubernetes.io/projected/41f8f678-af44-4ddc-b2db-01f96bae8601-kube-api-access-rrkdz\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.723309 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-run-httpd\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.723367 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-config-data\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.723404 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.723424 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " 
pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.723478 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-log-httpd\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.723527 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-scripts\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.723556 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.724143 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-run-httpd\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.724404 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-log-httpd\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.728081 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" 
path="/var/lib/kubelet/pods/c8162bea-daa1-42e3-8921-3c12ad56dfa6/volumes" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.740709 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.741759 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-scripts\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.741990 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.742115 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-config-data\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.742412 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.747127 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrkdz\" (UniqueName: 
\"kubernetes.io/projected/41f8f678-af44-4ddc-b2db-01f96bae8601-kube-api-access-rrkdz\") pod \"ceilometer-0\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") " pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.864715 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:56:13 crc kubenswrapper[4845]: I0202 10:56:13.888076 4845 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod516e4c98-314a-4116-b0fc-45c18fd1c7e1"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod516e4c98-314a-4116-b0fc-45c18fd1c7e1] : Timed out while waiting for systemd to remove kubepods-besteffort-pod516e4c98_314a_4116_b0fc_45c18fd1c7e1.slice" Feb 02 10:56:14 crc kubenswrapper[4845]: I0202 10:56:14.205933 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:14 crc kubenswrapper[4845]: I0202 10:56:14.391533 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:56:14 crc kubenswrapper[4845]: W0202 10:56:14.393053 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41f8f678_af44_4ddc_b2db_01f96bae8601.slice/crio-568792204d8f6e91541e175f4c66f405089f0798e5551671c568bec6667a39ac WatchSource:0}: Error finding container 568792204d8f6e91541e175f4c66f405089f0798e5551671c568bec6667a39ac: Status 404 returned error can't find the container with id 568792204d8f6e91541e175f4c66f405089f0798e5551671c568bec6667a39ac Feb 02 10:56:15 crc kubenswrapper[4845]: I0202 10:56:15.413234 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerStarted","Data":"27f273dfdbc6f48a241cefa433f71908752cdbef7c94c8988ca4fbc3330d687c"} Feb 02 10:56:15 crc kubenswrapper[4845]: I0202 10:56:15.413505 4845 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerStarted","Data":"568792204d8f6e91541e175f4c66f405089f0798e5551671c568bec6667a39ac"} Feb 02 10:56:15 crc kubenswrapper[4845]: I0202 10:56:15.652132 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:15 crc kubenswrapper[4845]: I0202 10:56:15.675538 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.237256 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.237629 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.432687 4845 generic.go:334] "Generic (PLEG): container finished" podID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerID="0bac2a8d77845121bf798ed7e7bbbcdf0da36dc5492619b8084fe476f2575739" exitCode=0 Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.432799 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55","Type":"ContainerDied","Data":"0bac2a8d77845121bf798ed7e7bbbcdf0da36dc5492619b8084fe476f2575739"} Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.440141 4845 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerStarted","Data":"2260ccaf2924a7de756705c5ed9aff5c1bd71d1d2fc36f15c40d4d5dc70e271f"} Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.468664 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.570782 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.699987 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-config-data\") pod \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.700072 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-combined-ca-bundle\") pod \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.700292 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m98z8\" (UniqueName: \"kubernetes.io/projected/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-kube-api-access-m98z8\") pod \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.700473 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-logs\") pod \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\" (UID: \"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55\") " Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 
10:56:16.701114 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-logs" (OuterVolumeSpecName: "logs") pod "7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" (UID: "7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.701623 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.706911 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-kube-api-access-m98z8" (OuterVolumeSpecName: "kube-api-access-m98z8") pod "7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" (UID: "7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55"). InnerVolumeSpecName "kube-api-access-m98z8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.742813 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5zwdv"] Feb 02 10:56:16 crc kubenswrapper[4845]: E0202 10:56:16.743366 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-log" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.743383 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-log" Feb 02 10:56:16 crc kubenswrapper[4845]: E0202 10:56:16.743419 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-api" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.743426 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-api" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.743639 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-log" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.743672 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" containerName="nova-api-api" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.744483 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.775342 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.775549 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.797022 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" (UID: "7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.807780 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5zwdv"] Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.813609 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-config-data" (OuterVolumeSpecName: "config-data") pod "7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" (UID: "7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.813699 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-scripts\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.814034 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.814129 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-config-data\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.814360 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8k2h\" (UniqueName: \"kubernetes.io/projected/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-kube-api-access-j8k2h\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.814481 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 
10:56:16.814494 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.814505 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m98z8\" (UniqueName: \"kubernetes.io/projected/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55-kube-api-access-m98z8\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.919576 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-config-data\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.919878 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8k2h\" (UniqueName: \"kubernetes.io/projected/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-kube-api-access-j8k2h\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.919956 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-scripts\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.920257 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: 
\"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.924007 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.946387 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8k2h\" (UniqueName: \"kubernetes.io/projected/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-kube-api-access-j8k2h\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.947348 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-config-data\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:16 crc kubenswrapper[4845]: I0202 10:56:16.947733 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-scripts\") pod \"nova-cell1-cell-mapping-5zwdv\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") " pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.102037 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5zwdv" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.455176 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerStarted","Data":"a59beb72591810706d3416b2fffeb070930f113386bdfdb4800fa23b079aa1da"} Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.457535 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.457591 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55","Type":"ContainerDied","Data":"07a8bb175c6ab486cf8171570281ab764339f6cc80042312c0c66b5beebb4313"} Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.457667 4845 scope.go:117] "RemoveContainer" containerID="0bac2a8d77845121bf798ed7e7bbbcdf0da36dc5492619b8084fe476f2575739" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.490032 4845 scope.go:117] "RemoveContainer" containerID="b18cfab9ee8f5f4276f99dca37c12a86dbf672950e6b072ff69b9e4ffc29f67d" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.548960 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.581957 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.618681 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.620548 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.630686 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.634977 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.636271 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.651944 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.734925 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55" path="/var/lib/kubelet/pods/7a2a4f87-44b1-4d9b-b4a9-687bbf6d2e55/volumes" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.735708 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5zwdv"] Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.749432 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-public-tls-certs\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.749633 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.749717 4845 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.749788 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-config-data\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.749870 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722f6d8a-3c97-4061-b26a-f8ec00f65006-logs\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.749948 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wfh5\" (UniqueName: \"kubernetes.io/projected/722f6d8a-3c97-4061-b26a-f8ec00f65006-kube-api-access-8wfh5\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.852933 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-config-data\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.853085 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722f6d8a-3c97-4061-b26a-f8ec00f65006-logs\") pod \"nova-api-0\" (UID: 
\"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.853972 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722f6d8a-3c97-4061-b26a-f8ec00f65006-logs\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.854040 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wfh5\" (UniqueName: \"kubernetes.io/projected/722f6d8a-3c97-4061-b26a-f8ec00f65006-kube-api-access-8wfh5\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.854162 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-public-tls-certs\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.854353 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.854486 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.859905 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.860235 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-config-data\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.861644 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-public-tls-certs\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.861944 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.877282 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wfh5\" (UniqueName: \"kubernetes.io/projected/722f6d8a-3c97-4061-b26a-f8ec00f65006-kube-api-access-8wfh5\") pod \"nova-api-0\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") " pod="openstack/nova-api-0" Feb 02 10:56:17 crc kubenswrapper[4845]: I0202 10:56:17.946359 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:56:18 crc kubenswrapper[4845]: I0202 10:56:18.472203 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5zwdv" event={"ID":"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b","Type":"ContainerStarted","Data":"8584a98006ccb553e74a988ef9d575c49271ebfc304ac836fbd4745ff8e13b8d"} Feb 02 10:56:18 crc kubenswrapper[4845]: I0202 10:56:18.472566 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5zwdv" event={"ID":"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b","Type":"ContainerStarted","Data":"29f5f8cab78cd261d0abd37ffd5974b0926bb3d10de649b68d9d21e2b1208d82"} Feb 02 10:56:18 crc kubenswrapper[4845]: I0202 10:56:18.512444 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5zwdv" podStartSLOduration=2.512420627 podStartE2EDuration="2.512420627s" podCreationTimestamp="2026-02-02 10:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:18.502707604 +0000 UTC m=+1459.594109054" watchObservedRunningTime="2026-02-02 10:56:18.512420627 +0000 UTC m=+1459.603822067" Feb 02 10:56:18 crc kubenswrapper[4845]: W0202 10:56:18.549350 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod722f6d8a_3c97_4061_b26a_f8ec00f65006.slice/crio-33b69e32755cf4bbe30b319fc4768886cb90925b277cb1bb998366cf1a27877f WatchSource:0}: Error finding container 33b69e32755cf4bbe30b319fc4768886cb90925b277cb1bb998366cf1a27877f: Status 404 returned error can't find the container with id 33b69e32755cf4bbe30b319fc4768886cb90925b277cb1bb998366cf1a27877f Feb 02 10:56:18 crc kubenswrapper[4845]: I0202 10:56:18.560387 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:18 crc kubenswrapper[4845]: I0202 
10:56:18.793053 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-785dh" Feb 02 10:56:18 crc kubenswrapper[4845]: I0202 10:56:18.871391 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hvdzc"] Feb 02 10:56:18 crc kubenswrapper[4845]: I0202 10:56:18.871699 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" podUID="83cd6f6d-3615-46e0-875a-e1cec10e9631" containerName="dnsmasq-dns" containerID="cri-o://4588a5acb74ef82abd161d26d28f375cb1f56efac97a86455f032c5662e7188e" gracePeriod=10 Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.542513 4845 generic.go:334] "Generic (PLEG): container finished" podID="83cd6f6d-3615-46e0-875a-e1cec10e9631" containerID="4588a5acb74ef82abd161d26d28f375cb1f56efac97a86455f032c5662e7188e" exitCode=0 Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.542843 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" event={"ID":"83cd6f6d-3615-46e0-875a-e1cec10e9631","Type":"ContainerDied","Data":"4588a5acb74ef82abd161d26d28f375cb1f56efac97a86455f032c5662e7188e"} Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.547131 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"722f6d8a-3c97-4061-b26a-f8ec00f65006","Type":"ContainerStarted","Data":"a76cab8c49a716e1851e502e56d3f3205dc6fe1e003ead53af25ac18bdab30b7"} Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.547170 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"722f6d8a-3c97-4061-b26a-f8ec00f65006","Type":"ContainerStarted","Data":"64b410e414340d61d34d86ceee1eabcc7870da5967de863190b7a1e715af98f3"} Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.547181 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"722f6d8a-3c97-4061-b26a-f8ec00f65006","Type":"ContainerStarted","Data":"33b69e32755cf4bbe30b319fc4768886cb90925b277cb1bb998366cf1a27877f"} Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.609864 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.609838104 podStartE2EDuration="2.609838104s" podCreationTimestamp="2026-02-02 10:56:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:19.571256789 +0000 UTC m=+1460.662658259" watchObservedRunningTime="2026-02-02 10:56:19.609838104 +0000 UTC m=+1460.701239554" Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.627604 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.751443 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-svc\") pod \"83cd6f6d-3615-46e0-875a-e1cec10e9631\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.751609 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-nb\") pod \"83cd6f6d-3615-46e0-875a-e1cec10e9631\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.751668 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-config\") pod \"83cd6f6d-3615-46e0-875a-e1cec10e9631\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.751742 4845 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-swift-storage-0\") pod \"83cd6f6d-3615-46e0-875a-e1cec10e9631\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.751771 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-sb\") pod \"83cd6f6d-3615-46e0-875a-e1cec10e9631\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.751820 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f27jh\" (UniqueName: \"kubernetes.io/projected/83cd6f6d-3615-46e0-875a-e1cec10e9631-kube-api-access-f27jh\") pod \"83cd6f6d-3615-46e0-875a-e1cec10e9631\" (UID: \"83cd6f6d-3615-46e0-875a-e1cec10e9631\") " Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.768577 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83cd6f6d-3615-46e0-875a-e1cec10e9631-kube-api-access-f27jh" (OuterVolumeSpecName: "kube-api-access-f27jh") pod "83cd6f6d-3615-46e0-875a-e1cec10e9631" (UID: "83cd6f6d-3615-46e0-875a-e1cec10e9631"). InnerVolumeSpecName "kube-api-access-f27jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:19 crc kubenswrapper[4845]: I0202 10:56:19.855380 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f27jh\" (UniqueName: \"kubernetes.io/projected/83cd6f6d-3615-46e0-875a-e1cec10e9631-kube-api-access-f27jh\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.018016 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "83cd6f6d-3615-46e0-875a-e1cec10e9631" (UID: "83cd6f6d-3615-46e0-875a-e1cec10e9631"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.024352 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83cd6f6d-3615-46e0-875a-e1cec10e9631" (UID: "83cd6f6d-3615-46e0-875a-e1cec10e9631"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.036134 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "83cd6f6d-3615-46e0-875a-e1cec10e9631" (UID: "83cd6f6d-3615-46e0-875a-e1cec10e9631"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.048434 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83cd6f6d-3615-46e0-875a-e1cec10e9631" (UID: "83cd6f6d-3615-46e0-875a-e1cec10e9631"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.048789 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-config" (OuterVolumeSpecName: "config") pod "83cd6f6d-3615-46e0-875a-e1cec10e9631" (UID: "83cd6f6d-3615-46e0-875a-e1cec10e9631"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.060285 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.060318 4845 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.060329 4845 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.060339 4845 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:20 
crc kubenswrapper[4845]: I0202 10:56:20.060362 4845 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83cd6f6d-3615-46e0-875a-e1cec10e9631-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.558637 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" event={"ID":"83cd6f6d-3615-46e0-875a-e1cec10e9631","Type":"ContainerDied","Data":"5459dadbda58e0ce878baeea7244a3a46fe1a38ff8b8032f5afcf3e9f7c8bd0d"} Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.558670 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-hvdzc" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.558704 4845 scope.go:117] "RemoveContainer" containerID="4588a5acb74ef82abd161d26d28f375cb1f56efac97a86455f032c5662e7188e" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.563354 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerStarted","Data":"5f868885fdf4c7ccdaf9be4e2929167b43f5c44340ad27b9a68723598a5a4d1d"} Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.563627 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="proxy-httpd" containerID="cri-o://5f868885fdf4c7ccdaf9be4e2929167b43f5c44340ad27b9a68723598a5a4d1d" gracePeriod=30 Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.563637 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="sg-core" containerID="cri-o://a59beb72591810706d3416b2fffeb070930f113386bdfdb4800fa23b079aa1da" gracePeriod=30 Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.563599 4845 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="ceilometer-central-agent" containerID="cri-o://27f273dfdbc6f48a241cefa433f71908752cdbef7c94c8988ca4fbc3330d687c" gracePeriod=30 Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.563744 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="ceilometer-notification-agent" containerID="cri-o://2260ccaf2924a7de756705c5ed9aff5c1bd71d1d2fc36f15c40d4d5dc70e271f" gracePeriod=30 Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.588099 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.575834855 podStartE2EDuration="7.588076886s" podCreationTimestamp="2026-02-02 10:56:13 +0000 UTC" firstStartedPulling="2026-02-02 10:56:14.396530081 +0000 UTC m=+1455.487931541" lastFinishedPulling="2026-02-02 10:56:19.408772122 +0000 UTC m=+1460.500173572" observedRunningTime="2026-02-02 10:56:20.587215191 +0000 UTC m=+1461.678616641" watchObservedRunningTime="2026-02-02 10:56:20.588076886 +0000 UTC m=+1461.679478336" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.602160 4845 scope.go:117] "RemoveContainer" containerID="f824758afe3c663578c7666b1f094db1caee520920ba737c52e548b07f3ee9ad" Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.623861 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hvdzc"] Feb 02 10:56:20 crc kubenswrapper[4845]: I0202 10:56:20.634302 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-hvdzc"] Feb 02 10:56:21 crc kubenswrapper[4845]: I0202 10:56:21.584292 4845 generic.go:334] "Generic (PLEG): container finished" podID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerID="5f868885fdf4c7ccdaf9be4e2929167b43f5c44340ad27b9a68723598a5a4d1d" exitCode=0 Feb 02 
10:56:21 crc kubenswrapper[4845]: I0202 10:56:21.584610 4845 generic.go:334] "Generic (PLEG): container finished" podID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerID="a59beb72591810706d3416b2fffeb070930f113386bdfdb4800fa23b079aa1da" exitCode=2
Feb 02 10:56:21 crc kubenswrapper[4845]: I0202 10:56:21.584619 4845 generic.go:334] "Generic (PLEG): container finished" podID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerID="2260ccaf2924a7de756705c5ed9aff5c1bd71d1d2fc36f15c40d4d5dc70e271f" exitCode=0
Feb 02 10:56:21 crc kubenswrapper[4845]: I0202 10:56:21.584366 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerDied","Data":"5f868885fdf4c7ccdaf9be4e2929167b43f5c44340ad27b9a68723598a5a4d1d"}
Feb 02 10:56:21 crc kubenswrapper[4845]: I0202 10:56:21.584667 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerDied","Data":"a59beb72591810706d3416b2fffeb070930f113386bdfdb4800fa23b079aa1da"}
Feb 02 10:56:21 crc kubenswrapper[4845]: I0202 10:56:21.584685 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerDied","Data":"2260ccaf2924a7de756705c5ed9aff5c1bd71d1d2fc36f15c40d4d5dc70e271f"}
Feb 02 10:56:21 crc kubenswrapper[4845]: I0202 10:56:21.731439 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83cd6f6d-3615-46e0-875a-e1cec10e9631" path="/var/lib/kubelet/pods/83cd6f6d-3615-46e0-875a-e1cec10e9631/volumes"
Feb 02 10:56:24 crc kubenswrapper[4845]: I0202 10:56:24.631934 4845 generic.go:334] "Generic (PLEG): container finished" podID="2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" containerID="8584a98006ccb553e74a988ef9d575c49271ebfc304ac836fbd4745ff8e13b8d" exitCode=0
Feb 02 10:56:24 crc kubenswrapper[4845]: I0202 10:56:24.632055 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5zwdv" event={"ID":"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b","Type":"ContainerDied","Data":"8584a98006ccb553e74a988ef9d575c49271ebfc304ac836fbd4745ff8e13b8d"}
Feb 02 10:56:25 crc kubenswrapper[4845]: I0202 10:56:25.649049 4845 generic.go:334] "Generic (PLEG): container finished" podID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerID="27f273dfdbc6f48a241cefa433f71908752cdbef7c94c8988ca4fbc3330d687c" exitCode=0
Feb 02 10:56:25 crc kubenswrapper[4845]: I0202 10:56:25.649127 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerDied","Data":"27f273dfdbc6f48a241cefa433f71908752cdbef7c94c8988ca4fbc3330d687c"}
Feb 02 10:56:25 crc kubenswrapper[4845]: I0202 10:56:25.972222 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.135296 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-run-httpd\") pod \"41f8f678-af44-4ddc-b2db-01f96bae8601\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.135693 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-combined-ca-bundle\") pod \"41f8f678-af44-4ddc-b2db-01f96bae8601\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.135757 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-sg-core-conf-yaml\") pod \"41f8f678-af44-4ddc-b2db-01f96bae8601\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.135838 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-config-data\") pod \"41f8f678-af44-4ddc-b2db-01f96bae8601\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.135857 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "41f8f678-af44-4ddc-b2db-01f96bae8601" (UID: "41f8f678-af44-4ddc-b2db-01f96bae8601"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.136022 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-log-httpd\") pod \"41f8f678-af44-4ddc-b2db-01f96bae8601\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.136131 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrkdz\" (UniqueName: \"kubernetes.io/projected/41f8f678-af44-4ddc-b2db-01f96bae8601-kube-api-access-rrkdz\") pod \"41f8f678-af44-4ddc-b2db-01f96bae8601\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.136249 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-ceilometer-tls-certs\") pod \"41f8f678-af44-4ddc-b2db-01f96bae8601\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.136362 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-scripts\") pod \"41f8f678-af44-4ddc-b2db-01f96bae8601\" (UID: \"41f8f678-af44-4ddc-b2db-01f96bae8601\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.136539 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "41f8f678-af44-4ddc-b2db-01f96bae8601" (UID: "41f8f678-af44-4ddc-b2db-01f96bae8601"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.137261 4845 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.137289 4845 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/41f8f678-af44-4ddc-b2db-01f96bae8601-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.143289 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41f8f678-af44-4ddc-b2db-01f96bae8601-kube-api-access-rrkdz" (OuterVolumeSpecName: "kube-api-access-rrkdz") pod "41f8f678-af44-4ddc-b2db-01f96bae8601" (UID: "41f8f678-af44-4ddc-b2db-01f96bae8601"). InnerVolumeSpecName "kube-api-access-rrkdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.144336 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-scripts" (OuterVolumeSpecName: "scripts") pod "41f8f678-af44-4ddc-b2db-01f96bae8601" (UID: "41f8f678-af44-4ddc-b2db-01f96bae8601"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.146956 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5zwdv"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.207672 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "41f8f678-af44-4ddc-b2db-01f96bae8601" (UID: "41f8f678-af44-4ddc-b2db-01f96bae8601"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.238835 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-combined-ca-bundle\") pod \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.239115 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8k2h\" (UniqueName: \"kubernetes.io/projected/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-kube-api-access-j8k2h\") pod \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.239250 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-scripts\") pod \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.239345 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-config-data\") pod \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\" (UID: \"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b\") "
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.240309 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrkdz\" (UniqueName: \"kubernetes.io/projected/41f8f678-af44-4ddc-b2db-01f96bae8601-kube-api-access-rrkdz\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.240338 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.240353 4845 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.243505 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-scripts" (OuterVolumeSpecName: "scripts") pod "2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" (UID: "2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.243531 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-kube-api-access-j8k2h" (OuterVolumeSpecName: "kube-api-access-j8k2h") pod "2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" (UID: "2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b"). InnerVolumeSpecName "kube-api-access-j8k2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.245519 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "41f8f678-af44-4ddc-b2db-01f96bae8601" (UID: "41f8f678-af44-4ddc-b2db-01f96bae8601"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.274360 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-config-data" (OuterVolumeSpecName: "config-data") pod "2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" (UID: "2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.274905 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" (UID: "2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.276257 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41f8f678-af44-4ddc-b2db-01f96bae8601" (UID: "41f8f678-af44-4ddc-b2db-01f96bae8601"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.311693 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-config-data" (OuterVolumeSpecName: "config-data") pod "41f8f678-af44-4ddc-b2db-01f96bae8601" (UID: "41f8f678-af44-4ddc-b2db-01f96bae8601"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.342469 4845 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.342516 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.342529 4845 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.342542 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.342552 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.342560 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41f8f678-af44-4ddc-b2db-01f96bae8601-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.342569 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8k2h\" (UniqueName: \"kubernetes.io/projected/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b-kube-api-access-j8k2h\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.676122 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"41f8f678-af44-4ddc-b2db-01f96bae8601","Type":"ContainerDied","Data":"568792204d8f6e91541e175f4c66f405089f0798e5551671c568bec6667a39ac"}
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.676209 4845 scope.go:117] "RemoveContainer" containerID="5f868885fdf4c7ccdaf9be4e2929167b43f5c44340ad27b9a68723598a5a4d1d"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.677216 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.683379 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5zwdv" event={"ID":"2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b","Type":"ContainerDied","Data":"29f5f8cab78cd261d0abd37ffd5974b0926bb3d10de649b68d9d21e2b1208d82"}
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.683421 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29f5f8cab78cd261d0abd37ffd5974b0926bb3d10de649b68d9d21e2b1208d82"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.683487 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5zwdv"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.723315 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.724504 4845 scope.go:117] "RemoveContainer" containerID="a59beb72591810706d3416b2fffeb070930f113386bdfdb4800fa23b079aa1da"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.744295 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.757638 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:56:26 crc kubenswrapper[4845]: E0202 10:56:26.759057 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="ceilometer-notification-agent"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.759195 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="ceilometer-notification-agent"
Feb 02 10:56:26 crc kubenswrapper[4845]: E0202 10:56:26.759275 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="sg-core"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.759325 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="sg-core"
Feb 02 10:56:26 crc kubenswrapper[4845]: E0202 10:56:26.759417 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83cd6f6d-3615-46e0-875a-e1cec10e9631" containerName="dnsmasq-dns"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.759466 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="83cd6f6d-3615-46e0-875a-e1cec10e9631" containerName="dnsmasq-dns"
Feb 02 10:56:26 crc kubenswrapper[4845]: E0202 10:56:26.759517 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" containerName="nova-manage"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.759572 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" containerName="nova-manage"
Feb 02 10:56:26 crc kubenswrapper[4845]: E0202 10:56:26.759710 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="proxy-httpd"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.759792 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="proxy-httpd"
Feb 02 10:56:26 crc kubenswrapper[4845]: E0202 10:56:26.759903 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="ceilometer-central-agent"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.759986 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="ceilometer-central-agent"
Feb 02 10:56:26 crc kubenswrapper[4845]: E0202 10:56:26.760066 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83cd6f6d-3615-46e0-875a-e1cec10e9631" containerName="init"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.760123 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="83cd6f6d-3615-46e0-875a-e1cec10e9631" containerName="init"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.760500 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="ceilometer-notification-agent"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.760567 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="sg-core"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.760638 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="ceilometer-central-agent"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.760707 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="83cd6f6d-3615-46e0-875a-e1cec10e9631" containerName="dnsmasq-dns"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.760768 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" containerName="proxy-httpd"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.760834 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" containerName="nova-manage"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.763475 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.768412 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.768554 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.770608 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.776096 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.791076 4845 scope.go:117] "RemoveContainer" containerID="2260ccaf2924a7de756705c5ed9aff5c1bd71d1d2fc36f15c40d4d5dc70e271f"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.819749 4845 scope.go:117] "RemoveContainer" containerID="27f273dfdbc6f48a241cefa433f71908752cdbef7c94c8988ca4fbc3330d687c"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.854719 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-scripts\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.854787 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.860035 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-config-data\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.860130 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frkbj\" (UniqueName: \"kubernetes.io/projected/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-kube-api-access-frkbj\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.860158 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-log-httpd\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.860256 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-run-httpd\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.860328 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.860447 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.938680 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.939171 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="652d8576-912d-4384-b487-aa6b987b567f" containerName="nova-scheduler-scheduler" containerID="cri-o://444f6ac6315562315a504c0600f21ab36cd45bf454f4002a51c1be86e2c6f5bb" gracePeriod=30
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963153 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-scripts\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963210 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963288 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-config-data\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963318 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frkbj\" (UniqueName: \"kubernetes.io/projected/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-kube-api-access-frkbj\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963338 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-log-httpd\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963361 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-run-httpd\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963389 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963431 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963755 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.963849 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-log-httpd\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.964143 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-run-httpd\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.964305 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerName="nova-api-log" containerID="cri-o://64b410e414340d61d34d86ceee1eabcc7870da5967de863190b7a1e715af98f3" gracePeriod=30
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.964546 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerName="nova-api-api" containerID="cri-o://a76cab8c49a716e1851e502e56d3f3205dc6fe1e003ead53af25ac18bdab30b7" gracePeriod=30
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.968371 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.969463 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.969564 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.973627 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-config-data\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.982636 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-scripts\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.985898 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.986170 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-log" containerID="cri-o://d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79" gracePeriod=30
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.986317 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-metadata" containerID="cri-o://1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d" gracePeriod=30
Feb 02 10:56:26 crc kubenswrapper[4845]: I0202 10:56:26.999699 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frkbj\" (UniqueName: \"kubernetes.io/projected/813ec32b-5cd3-491d-85ac-bcf0140d0a8f-kube-api-access-frkbj\") pod \"ceilometer-0\" (UID: \"813ec32b-5cd3-491d-85ac-bcf0140d0a8f\") " pod="openstack/ceilometer-0"
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.083941 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.671734 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:56:27 crc kubenswrapper[4845]: W0202 10:56:27.676044 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod813ec32b_5cd3_491d_85ac_bcf0140d0a8f.slice/crio-49d1e4a21ffc7b3a8cdc928016c2ee2786bd0f5de81b8f9bf0cd6f9da50cc727 WatchSource:0}: Error finding container 49d1e4a21ffc7b3a8cdc928016c2ee2786bd0f5de81b8f9bf0cd6f9da50cc727: Status 404 returned error can't find the container with id 49d1e4a21ffc7b3a8cdc928016c2ee2786bd0f5de81b8f9bf0cd6f9da50cc727
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.741560 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41f8f678-af44-4ddc-b2db-01f96bae8601" path="/var/lib/kubelet/pods/41f8f678-af44-4ddc-b2db-01f96bae8601/volumes"
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.742519 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"813ec32b-5cd3-491d-85ac-bcf0140d0a8f","Type":"ContainerStarted","Data":"49d1e4a21ffc7b3a8cdc928016c2ee2786bd0f5de81b8f9bf0cd6f9da50cc727"}
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.756954 4845 generic.go:334] "Generic (PLEG): container finished" podID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerID="d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79" exitCode=143
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.757063 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab48fc91-e9f1-4362-8cb8-091846601a7e","Type":"ContainerDied","Data":"d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79"}
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.780457 4845 generic.go:334] "Generic (PLEG): container finished" podID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerID="a76cab8c49a716e1851e502e56d3f3205dc6fe1e003ead53af25ac18bdab30b7" exitCode=0
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.780499 4845 generic.go:334] "Generic (PLEG): container finished" podID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerID="64b410e414340d61d34d86ceee1eabcc7870da5967de863190b7a1e715af98f3" exitCode=143
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.780523 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"722f6d8a-3c97-4061-b26a-f8ec00f65006","Type":"ContainerDied","Data":"a76cab8c49a716e1851e502e56d3f3205dc6fe1e003ead53af25ac18bdab30b7"}
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.780589 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"722f6d8a-3c97-4061-b26a-f8ec00f65006","Type":"ContainerDied","Data":"64b410e414340d61d34d86ceee1eabcc7870da5967de863190b7a1e715af98f3"}
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.844586 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.997935 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wfh5\" (UniqueName: \"kubernetes.io/projected/722f6d8a-3c97-4061-b26a-f8ec00f65006-kube-api-access-8wfh5\") pod \"722f6d8a-3c97-4061-b26a-f8ec00f65006\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") "
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.997992 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-combined-ca-bundle\") pod \"722f6d8a-3c97-4061-b26a-f8ec00f65006\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") "
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.998108 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-public-tls-certs\") pod \"722f6d8a-3c97-4061-b26a-f8ec00f65006\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") "
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.998142 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-internal-tls-certs\") pod \"722f6d8a-3c97-4061-b26a-f8ec00f65006\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") "
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.998350 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-config-data\") pod \"722f6d8a-3c97-4061-b26a-f8ec00f65006\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") "
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.998417 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722f6d8a-3c97-4061-b26a-f8ec00f65006-logs\") pod \"722f6d8a-3c97-4061-b26a-f8ec00f65006\" (UID: \"722f6d8a-3c97-4061-b26a-f8ec00f65006\") "
Feb 02 10:56:27 crc kubenswrapper[4845]: I0202 10:56:27.999492 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/722f6d8a-3c97-4061-b26a-f8ec00f65006-logs" (OuterVolumeSpecName: "logs") pod "722f6d8a-3c97-4061-b26a-f8ec00f65006" (UID: "722f6d8a-3c97-4061-b26a-f8ec00f65006"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.042148 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722f6d8a-3c97-4061-b26a-f8ec00f65006-kube-api-access-8wfh5" (OuterVolumeSpecName: "kube-api-access-8wfh5") pod "722f6d8a-3c97-4061-b26a-f8ec00f65006" (UID: "722f6d8a-3c97-4061-b26a-f8ec00f65006"). InnerVolumeSpecName "kube-api-access-8wfh5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.053943 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-config-data" (OuterVolumeSpecName: "config-data") pod "722f6d8a-3c97-4061-b26a-f8ec00f65006" (UID: "722f6d8a-3c97-4061-b26a-f8ec00f65006"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.074032 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "722f6d8a-3c97-4061-b26a-f8ec00f65006" (UID: "722f6d8a-3c97-4061-b26a-f8ec00f65006"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.101327 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wfh5\" (UniqueName: \"kubernetes.io/projected/722f6d8a-3c97-4061-b26a-f8ec00f65006-kube-api-access-8wfh5\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.101352 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.101364 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.101372 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722f6d8a-3c97-4061-b26a-f8ec00f65006-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.101439 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "722f6d8a-3c97-4061-b26a-f8ec00f65006" (UID: "722f6d8a-3c97-4061-b26a-f8ec00f65006"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.150799 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "722f6d8a-3c97-4061-b26a-f8ec00f65006" (UID: "722f6d8a-3c97-4061-b26a-f8ec00f65006"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.203069 4845 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.203094 4845 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/722f6d8a-3c97-4061-b26a-f8ec00f65006-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.805688 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"813ec32b-5cd3-491d-85ac-bcf0140d0a8f","Type":"ContainerStarted","Data":"da85dccaa6e812287f989d1879096e61fb210356881e2cf93d8e838b865daec8"} Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.811669 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"722f6d8a-3c97-4061-b26a-f8ec00f65006","Type":"ContainerDied","Data":"33b69e32755cf4bbe30b319fc4768886cb90925b277cb1bb998366cf1a27877f"} Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.811740 4845 scope.go:117] "RemoveContainer" containerID="a76cab8c49a716e1851e502e56d3f3205dc6fe1e003ead53af25ac18bdab30b7" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.811916 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.866387 4845 generic.go:334] "Generic (PLEG): container finished" podID="652d8576-912d-4384-b487-aa6b987b567f" containerID="444f6ac6315562315a504c0600f21ab36cd45bf454f4002a51c1be86e2c6f5bb" exitCode=0 Feb 02 10:56:28 crc kubenswrapper[4845]: I0202 10:56:28.866600 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"652d8576-912d-4384-b487-aa6b987b567f","Type":"ContainerDied","Data":"444f6ac6315562315a504c0600f21ab36cd45bf454f4002a51c1be86e2c6f5bb"} Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.013739 4845 scope.go:117] "RemoveContainer" containerID="64b410e414340d61d34d86ceee1eabcc7870da5967de863190b7a1e715af98f3" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.044838 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.060115 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.075314 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:29 crc kubenswrapper[4845]: E0202 10:56:29.076244 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerName="nova-api-log" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.076309 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerName="nova-api-log" Feb 02 10:56:29 crc kubenswrapper[4845]: E0202 10:56:29.076390 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerName="nova-api-api" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.076449 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" 
containerName="nova-api-api" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.076767 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerName="nova-api-api" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.077123 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" containerName="nova-api-log" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.077081 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.077780 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="652d8576-912d-4384-b487-aa6b987b567f" containerName="nova-scheduler-scheduler" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.078724 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.085323 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.085607 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.085816 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.103207 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.248491 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-combined-ca-bundle\") pod \"652d8576-912d-4384-b487-aa6b987b567f\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " Feb 02 
10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.248763 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnk77\" (UniqueName: \"kubernetes.io/projected/652d8576-912d-4384-b487-aa6b987b567f-kube-api-access-vnk77\") pod \"652d8576-912d-4384-b487-aa6b987b567f\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.248919 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-config-data\") pod \"652d8576-912d-4384-b487-aa6b987b567f\" (UID: \"652d8576-912d-4384-b487-aa6b987b567f\") " Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.249252 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.249434 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-config-data\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.249661 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.249710 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-569rd\" (UniqueName: \"kubernetes.io/projected/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-kube-api-access-569rd\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.249897 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-logs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.249941 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-public-tls-certs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.258447 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/652d8576-912d-4384-b487-aa6b987b567f-kube-api-access-vnk77" (OuterVolumeSpecName: "kube-api-access-vnk77") pod "652d8576-912d-4384-b487-aa6b987b567f" (UID: "652d8576-912d-4384-b487-aa6b987b567f"). InnerVolumeSpecName "kube-api-access-vnk77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.286929 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-config-data" (OuterVolumeSpecName: "config-data") pod "652d8576-912d-4384-b487-aa6b987b567f" (UID: "652d8576-912d-4384-b487-aa6b987b567f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.288437 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "652d8576-912d-4384-b487-aa6b987b567f" (UID: "652d8576-912d-4384-b487-aa6b987b567f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352060 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352314 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-config-data\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352385 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352410 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-569rd\" (UniqueName: \"kubernetes.io/projected/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-kube-api-access-569rd\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352478 4845 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-logs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352506 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-public-tls-certs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352600 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnk77\" (UniqueName: \"kubernetes.io/projected/652d8576-912d-4384-b487-aa6b987b567f-kube-api-access-vnk77\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352622 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.352634 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652d8576-912d-4384-b487-aa6b987b567f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.353648 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-logs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.356129 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.356579 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-public-tls-certs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.357524 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.362708 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-config-data\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.371817 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-569rd\" (UniqueName: \"kubernetes.io/projected/953beda6-58f2-45c2-b34e-0cb7db2d3bf6-kube-api-access-569rd\") pod \"nova-api-0\" (UID: \"953beda6-58f2-45c2-b34e-0cb7db2d3bf6\") " pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.410152 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.732366 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722f6d8a-3c97-4061-b26a-f8ec00f65006" path="/var/lib/kubelet/pods/722f6d8a-3c97-4061-b26a-f8ec00f65006/volumes" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.884007 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"652d8576-912d-4384-b487-aa6b987b567f","Type":"ContainerDied","Data":"9d162ecc4fb3460600a969e51ca75c36688e2c03aaec73046e664f22f56be6a6"} Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.884028 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.884089 4845 scope.go:117] "RemoveContainer" containerID="444f6ac6315562315a504c0600f21ab36cd45bf454f4002a51c1be86e2c6f5bb" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.887102 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"813ec32b-5cd3-491d-85ac-bcf0140d0a8f","Type":"ContainerStarted","Data":"bde40b59bdf4479ccfb44f68da737645aab1a9c63a750d7f0ef0530fe8b00b04"} Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.911440 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.924913 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.949150 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:56:29 crc kubenswrapper[4845]: E0202 10:56:29.949925 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="652d8576-912d-4384-b487-aa6b987b567f" containerName="nova-scheduler-scheduler" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.950008 4845 
state_mem.go:107] "Deleted CPUSet assignment" podUID="652d8576-912d-4384-b487-aa6b987b567f" containerName="nova-scheduler-scheduler" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.951160 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.953488 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:56:29 crc kubenswrapper[4845]: I0202 10:56:29.981194 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.043511 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.071638 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.072156 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-config-data\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.072359 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xsgp\" (UniqueName: \"kubernetes.io/projected/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-kube-api-access-7xsgp\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.169892 
4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": read tcp 10.217.0.2:58524->10.217.0.249:8775: read: connection reset by peer" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.169952 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": read tcp 10.217.0.2:58528->10.217.0.249:8775: read: connection reset by peer" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.175515 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.175778 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-config-data\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.175834 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xsgp\" (UniqueName: \"kubernetes.io/projected/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-kube-api-access-7xsgp\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.181638 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-config-data\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.182399 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.200632 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xsgp\" (UniqueName: \"kubernetes.io/projected/b3eed39b-ccd7-4c3d-bbd8-6872503e1c60-kube-api-access-7xsgp\") pod \"nova-scheduler-0\" (UID: \"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60\") " pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.289190 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.766471 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.794431 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-combined-ca-bundle\") pod \"ab48fc91-e9f1-4362-8cb8-091846601a7e\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") " Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.847356 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab48fc91-e9f1-4362-8cb8-091846601a7e" (UID: "ab48fc91-e9f1-4362-8cb8-091846601a7e"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.875997 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.899365 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-nova-metadata-tls-certs\") pod \"ab48fc91-e9f1-4362-8cb8-091846601a7e\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") "
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.899672 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlh7l\" (UniqueName: \"kubernetes.io/projected/ab48fc91-e9f1-4362-8cb8-091846601a7e-kube-api-access-zlh7l\") pod \"ab48fc91-e9f1-4362-8cb8-091846601a7e\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") "
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.899729 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab48fc91-e9f1-4362-8cb8-091846601a7e-logs\") pod \"ab48fc91-e9f1-4362-8cb8-091846601a7e\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") "
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.899790 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-config-data\") pod \"ab48fc91-e9f1-4362-8cb8-091846601a7e\" (UID: \"ab48fc91-e9f1-4362-8cb8-091846601a7e\") "
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.900470 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.900676 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab48fc91-e9f1-4362-8cb8-091846601a7e-logs" (OuterVolumeSpecName: "logs") pod "ab48fc91-e9f1-4362-8cb8-091846601a7e" (UID: "ab48fc91-e9f1-4362-8cb8-091846601a7e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.904485 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab48fc91-e9f1-4362-8cb8-091846601a7e-kube-api-access-zlh7l" (OuterVolumeSpecName: "kube-api-access-zlh7l") pod "ab48fc91-e9f1-4362-8cb8-091846601a7e" (UID: "ab48fc91-e9f1-4362-8cb8-091846601a7e"). InnerVolumeSpecName "kube-api-access-zlh7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.938349 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"813ec32b-5cd3-491d-85ac-bcf0140d0a8f","Type":"ContainerStarted","Data":"a4b5e74bdb454e472c786f4693fd9b9111aaef9ac14fd8f77691e8053e36adae"}
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.946429 4845 generic.go:334] "Generic (PLEG): container finished" podID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerID="1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d" exitCode=0
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.946497 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab48fc91-e9f1-4362-8cb8-091846601a7e","Type":"ContainerDied","Data":"1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d"}
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.946526 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ab48fc91-e9f1-4362-8cb8-091846601a7e","Type":"ContainerDied","Data":"03916b40dcfb1d58e9a1856cf115e52af0968b63f0ef6c849ed5ef4cac5ddc0d"}
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.946544 4845 scope.go:117] "RemoveContainer" containerID="1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d"
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.946671 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.956145 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"953beda6-58f2-45c2-b34e-0cb7db2d3bf6","Type":"ContainerStarted","Data":"fe20f1006932a35e64b4cabe966be37477818387fbbdcf4c4237daac6a8ab17d"}
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.956198 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"953beda6-58f2-45c2-b34e-0cb7db2d3bf6","Type":"ContainerStarted","Data":"e749a478c4e428d3146f39065a93a280166d6689eb0197b0e6f5762e7e29c2d0"}
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.956217 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"953beda6-58f2-45c2-b34e-0cb7db2d3bf6","Type":"ContainerStarted","Data":"91e099fe3253140f734d5bfb54b7af2f37d445390a9f68543fb5261b3d4e23bb"}
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.968012 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60","Type":"ContainerStarted","Data":"75337218b7cb847596bd50c9548b2fcbaf05e00356b9ac1ca423ad46c58a0121"}
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.995397 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-config-data" (OuterVolumeSpecName: "config-data") pod "ab48fc91-e9f1-4362-8cb8-091846601a7e" (UID: "ab48fc91-e9f1-4362-8cb8-091846601a7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:30 crc kubenswrapper[4845]: I0202 10:56:30.995553 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.995534993 podStartE2EDuration="2.995534993s" podCreationTimestamp="2026-02-02 10:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:30.983995497 +0000 UTC m=+1472.075396947" watchObservedRunningTime="2026-02-02 10:56:30.995534993 +0000 UTC m=+1472.086936443"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.003367 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlh7l\" (UniqueName: \"kubernetes.io/projected/ab48fc91-e9f1-4362-8cb8-091846601a7e-kube-api-access-zlh7l\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.003414 4845 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab48fc91-e9f1-4362-8cb8-091846601a7e-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.003427 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.008433 4845 scope.go:117] "RemoveContainer" containerID="d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.024191 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ab48fc91-e9f1-4362-8cb8-091846601a7e" (UID: "ab48fc91-e9f1-4362-8cb8-091846601a7e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.044213 4845 scope.go:117] "RemoveContainer" containerID="1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d"
Feb 02 10:56:31 crc kubenswrapper[4845]: E0202 10:56:31.044994 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d\": container with ID starting with 1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d not found: ID does not exist" containerID="1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.045051 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d"} err="failed to get container status \"1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d\": rpc error: code = NotFound desc = could not find container \"1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d\": container with ID starting with 1179e5d38c7711858861d519e333bd2170cf5cc0718d6497caa7d2116458f55d not found: ID does not exist"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.045076 4845 scope.go:117] "RemoveContainer" containerID="d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79"
Feb 02 10:56:31 crc kubenswrapper[4845]: E0202 10:56:31.045722 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79\": container with ID starting with d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79 not found: ID does not exist" containerID="d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.045826 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79"} err="failed to get container status \"d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79\": rpc error: code = NotFound desc = could not find container \"d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79\": container with ID starting with d699d40e07287b19d500020a4c6395318b1aca063b567a81959a1c61c06a6c79 not found: ID does not exist"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.106554 4845 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab48fc91-e9f1-4362-8cb8-091846601a7e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.336860 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.383113 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.395260 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 10:56:31 crc kubenswrapper[4845]: E0202 10:56:31.395748 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-log"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.395790 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-log"
Feb 02 10:56:31 crc kubenswrapper[4845]: E0202 10:56:31.395824 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-metadata"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.395831 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-metadata"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.396061 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-log"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.396089 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" containerName="nova-metadata-metadata"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.397558 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.404662 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.406346 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.408631 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.448167 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-config-data\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.448226 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-logs\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.448306 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.449391 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.449716 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cvqp\" (UniqueName: \"kubernetes.io/projected/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-kube-api-access-2cvqp\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.552075 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.552194 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cvqp\" (UniqueName: \"kubernetes.io/projected/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-kube-api-access-2cvqp\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.552256 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-config-data\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.552280 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-logs\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.552330 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.554322 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-logs\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.558304 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.560280 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-config-data\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.561246 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.576556 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cvqp\" (UniqueName: \"kubernetes.io/projected/12adbd4d-efe1-4549-bcac-f2b5f14f18b9-kube-api-access-2cvqp\") pod \"nova-metadata-0\" (UID: \"12adbd4d-efe1-4549-bcac-f2b5f14f18b9\") " pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.719276 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.728255 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652d8576-912d-4384-b487-aa6b987b567f" path="/var/lib/kubelet/pods/652d8576-912d-4384-b487-aa6b987b567f/volumes"
Feb 02 10:56:31 crc kubenswrapper[4845]: I0202 10:56:31.728855 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab48fc91-e9f1-4362-8cb8-091846601a7e" path="/var/lib/kubelet/pods/ab48fc91-e9f1-4362-8cb8-091846601a7e/volumes"
Feb 02 10:56:32 crc kubenswrapper[4845]: I0202 10:56:32.037447 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3eed39b-ccd7-4c3d-bbd8-6872503e1c60","Type":"ContainerStarted","Data":"396b0203d72671aa7e4b87c3146da1335234c9ff74aba0fb3c6f0763fb91e65e"}
Feb 02 10:56:32 crc kubenswrapper[4845]: I0202 10:56:32.096570 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.096540205 podStartE2EDuration="3.096540205s" podCreationTimestamp="2026-02-02 10:56:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:32.084394071 +0000 UTC m=+1473.175795521" watchObservedRunningTime="2026-02-02 10:56:32.096540205 +0000 UTC m=+1473.187941655"
Feb 02 10:56:32 crc kubenswrapper[4845]: W0202 10:56:32.483550 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12adbd4d_efe1_4549_bcac_f2b5f14f18b9.slice/crio-bd87547cfe087444a3db7a133aef38b2f4c42f0b9d61af93217c2519a0a041e8 WatchSource:0}: Error finding container bd87547cfe087444a3db7a133aef38b2f4c42f0b9d61af93217c2519a0a041e8: Status 404 returned error can't find the container with id bd87547cfe087444a3db7a133aef38b2f4c42f0b9d61af93217c2519a0a041e8
Feb 02 10:56:32 crc kubenswrapper[4845]: I0202 10:56:32.485594 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 10:56:33 crc kubenswrapper[4845]: I0202 10:56:33.073397 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"813ec32b-5cd3-491d-85ac-bcf0140d0a8f","Type":"ContainerStarted","Data":"cfcdefd905ead75f76a06a4e18b3cf15176142f0f95015876c35874ddd232f1a"}
Feb 02 10:56:33 crc kubenswrapper[4845]: I0202 10:56:33.074225 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 02 10:56:33 crc kubenswrapper[4845]: I0202 10:56:33.086150 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12adbd4d-efe1-4549-bcac-f2b5f14f18b9","Type":"ContainerStarted","Data":"65475c23e9519977be49da9ac78ac49a2f12e0028a8de94546f0e723dfe9239d"}
Feb 02 10:56:33 crc kubenswrapper[4845]: I0202 10:56:33.086223 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12adbd4d-efe1-4549-bcac-f2b5f14f18b9","Type":"ContainerStarted","Data":"06489d103c7aad5e83dad1f6f603781de226b64d0f14299b12a2fcc051687cc2"}
Feb 02 10:56:33 crc kubenswrapper[4845]: I0202 10:56:33.086238 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"12adbd4d-efe1-4549-bcac-f2b5f14f18b9","Type":"ContainerStarted","Data":"bd87547cfe087444a3db7a133aef38b2f4c42f0b9d61af93217c2519a0a041e8"}
Feb 02 10:56:33 crc kubenswrapper[4845]: I0202 10:56:33.110696 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.957849022 podStartE2EDuration="7.110665434s" podCreationTimestamp="2026-02-02 10:56:26 +0000 UTC" firstStartedPulling="2026-02-02 10:56:27.682661301 +0000 UTC m=+1468.774062741" lastFinishedPulling="2026-02-02 10:56:31.835477703 +0000 UTC m=+1472.926879153" observedRunningTime="2026-02-02 10:56:33.099467947 +0000 UTC m=+1474.190869397" watchObservedRunningTime="2026-02-02 10:56:33.110665434 +0000 UTC m=+1474.202066884"
Feb 02 10:56:33 crc kubenswrapper[4845]: I0202 10:56:33.135264 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.13523177 podStartE2EDuration="2.13523177s" podCreationTimestamp="2026-02-02 10:56:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:56:33.130651766 +0000 UTC m=+1474.222053216" watchObservedRunningTime="2026-02-02 10:56:33.13523177 +0000 UTC m=+1474.226633560"
Feb 02 10:56:35 crc kubenswrapper[4845]: I0202 10:56:35.290608 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 02 10:56:36 crc kubenswrapper[4845]: I0202 10:56:36.719732 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 02 10:56:36 crc kubenswrapper[4845]: I0202 10:56:36.719800 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 02 10:56:39 crc kubenswrapper[4845]: I0202 10:56:39.412012 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 02 10:56:39 crc kubenswrapper[4845]: I0202 10:56:39.412566 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 02 10:56:40 crc kubenswrapper[4845]: I0202 10:56:40.290468 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 02 10:56:40 crc kubenswrapper[4845]: I0202 10:56:40.325605 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 02 10:56:40 crc kubenswrapper[4845]: I0202 10:56:40.420279 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="953beda6-58f2-45c2-b34e-0cb7db2d3bf6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.4:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 10:56:40 crc kubenswrapper[4845]: I0202 10:56:40.420281 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="953beda6-58f2-45c2-b34e-0cb7db2d3bf6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.4:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 10:56:41 crc kubenswrapper[4845]: I0202 10:56:41.348594 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 02 10:56:41 crc kubenswrapper[4845]: I0202 10:56:41.745332 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 02 10:56:41 crc kubenswrapper[4845]: I0202 10:56:41.745378 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 02 10:56:42 crc kubenswrapper[4845]: I0202 10:56:42.573151 4845 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c8162bea-daa1-42e3-8921-3c12ad56dfa6" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.250:3000/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 02 10:56:42 crc kubenswrapper[4845]: I0202 10:56:42.732129 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="12adbd4d-efe1-4549-bcac-f2b5f14f18b9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 10:56:42 crc kubenswrapper[4845]: I0202 10:56:42.732127 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="12adbd4d-efe1-4549-bcac-f2b5f14f18b9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 02 10:56:46 crc kubenswrapper[4845]: I0202 10:56:46.237160 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:56:46 crc kubenswrapper[4845]: I0202 10:56:46.237842 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:56:49 crc kubenswrapper[4845]: I0202 10:56:49.417778 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 02 10:56:49 crc kubenswrapper[4845]: I0202 10:56:49.418644 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 02 10:56:49 crc kubenswrapper[4845]: I0202 10:56:49.425695 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 02 10:56:49 crc kubenswrapper[4845]: I0202 10:56:49.431033 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 02 10:56:50 crc kubenswrapper[4845]: I0202 10:56:50.297855 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 02 10:56:50 crc kubenswrapper[4845]: I0202 10:56:50.304482 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 02 10:56:51 crc kubenswrapper[4845]: I0202 10:56:51.727875 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 02 10:56:51 crc kubenswrapper[4845]: I0202 10:56:51.735499 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 02 10:56:51 crc kubenswrapper[4845]: I0202 10:56:51.736958 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 02 10:56:52 crc kubenswrapper[4845]: I0202 10:56:52.321742 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 02 10:56:57 crc kubenswrapper[4845]: I0202 10:56:57.098507 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 02 10:57:02 crc kubenswrapper[4845]: I0202 10:57:02.881242 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fqkz8"]
Feb 02 10:57:02 crc kubenswrapper[4845]: I0202 10:57:02.886660 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:02 crc kubenswrapper[4845]: I0202 10:57:02.935279 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqkz8"]
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.009721 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-catalog-content\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.009908 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-utilities\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.010039 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8b6p\" (UniqueName: \"kubernetes.io/projected/21872d04-38b8-449f-a478-0f534e3632e0-kube-api-access-j8b6p\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.112668 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-utilities\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.112844 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8b6p\" (UniqueName: \"kubernetes.io/projected/21872d04-38b8-449f-a478-0f534e3632e0-kube-api-access-j8b6p\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.113057 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-catalog-content\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.113961 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-catalog-content\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.114018 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-utilities\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.142955 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8b6p\" (UniqueName: \"kubernetes.io/projected/21872d04-38b8-449f-a478-0f534e3632e0-kube-api-access-j8b6p\") pod \"redhat-operators-fqkz8\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.215946 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:03 crc kubenswrapper[4845]: I0202 10:57:03.776545 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqkz8"]
Feb 02 10:57:04 crc kubenswrapper[4845]: I0202 10:57:04.480175 4845 generic.go:334] "Generic (PLEG): container finished" podID="21872d04-38b8-449f-a478-0f534e3632e0" containerID="173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55" exitCode=0
Feb 02 10:57:04 crc kubenswrapper[4845]: I0202 10:57:04.480467 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkz8" event={"ID":"21872d04-38b8-449f-a478-0f534e3632e0","Type":"ContainerDied","Data":"173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55"}
Feb 02 10:57:04 crc kubenswrapper[4845]: I0202 10:57:04.480497 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkz8" event={"ID":"21872d04-38b8-449f-a478-0f534e3632e0","Type":"ContainerStarted","Data":"79b771075a004dbc2e6ac2bf2681317d5263184874aaff3482381deacd0ed3a1"}
Feb 02 10:57:04 crc kubenswrapper[4845]: I0202 10:57:04.483687 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 10:57:05 crc kubenswrapper[4845]: I0202 10:57:05.496417 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkz8" event={"ID":"21872d04-38b8-449f-a478-0f534e3632e0","Type":"ContainerStarted","Data":"9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c"}
Feb 02 10:57:11 crc kubenswrapper[4845]: I0202 10:57:11.584965 4845 generic.go:334] "Generic (PLEG): container finished" podID="21872d04-38b8-449f-a478-0f534e3632e0" containerID="9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c" exitCode=0
Feb 02 10:57:11 crc kubenswrapper[4845]: I0202 10:57:11.585014 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkz8" event={"ID":"21872d04-38b8-449f-a478-0f534e3632e0","Type":"ContainerDied","Data":"9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c"}
Feb 02 10:57:12 crc kubenswrapper[4845]: I0202 10:57:12.601058 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkz8" event={"ID":"21872d04-38b8-449f-a478-0f534e3632e0","Type":"ContainerStarted","Data":"b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c"}
Feb 02 10:57:12 crc kubenswrapper[4845]: I0202 10:57:12.628410 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fqkz8" podStartSLOduration=3.096168214 podStartE2EDuration="10.628369088s" podCreationTimestamp="2026-02-02 10:57:02 +0000 UTC" firstStartedPulling="2026-02-02 10:57:04.483378637 +0000 UTC m=+1505.574780087" lastFinishedPulling="2026-02-02 10:57:12.015579491 +0000 UTC m=+1513.106980961" observedRunningTime="2026-02-02 10:57:12.623837776 +0000 UTC m=+1513.715239236" watchObservedRunningTime="2026-02-02 10:57:12.628369088 +0000 UTC m=+1513.719770538"
Feb 02 10:57:13 crc kubenswrapper[4845]: I0202 10:57:13.216598 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:13 crc kubenswrapper[4845]: I0202 10:57:13.216642 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:14 crc kubenswrapper[4845]: I0202 10:57:14.273740 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqkz8" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="registry-server" probeResult="failure" output=<
Feb 02 10:57:14 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s
Feb 02 10:57:14 crc kubenswrapper[4845]: >
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.237953 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.238351 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.238422 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9"
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.239756 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.239869 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" gracePeriod=600
Feb 02 10:57:16 crc kubenswrapper[4845]: E0202 10:57:16.364419 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.653734 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" exitCode=0
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.653847 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"}
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.654110 4845 scope.go:117] "RemoveContainer" containerID="6667d6885fd474a5baafce195af3c9008051b075b4b764b236fc396ff08f675c"
Feb 02 10:57:16 crc kubenswrapper[4845]: I0202 10:57:16.654857 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:57:16 crc kubenswrapper[4845]: E0202 10:57:16.655245 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:57:23 crc kubenswrapper[4845]: I0202 10:57:23.277400 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fqkz8"
Feb 02 10:57:23 crc kubenswrapper[4845]: I0202 10:57:23.337689 4845 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fqkz8" Feb 02 10:57:23 crc kubenswrapper[4845]: I0202 10:57:23.519462 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqkz8"] Feb 02 10:57:24 crc kubenswrapper[4845]: I0202 10:57:24.753131 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fqkz8" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="registry-server" containerID="cri-o://b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c" gracePeriod=2 Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.544771 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqkz8" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.627453 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-catalog-content\") pod \"21872d04-38b8-449f-a478-0f534e3632e0\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.627541 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8b6p\" (UniqueName: \"kubernetes.io/projected/21872d04-38b8-449f-a478-0f534e3632e0-kube-api-access-j8b6p\") pod \"21872d04-38b8-449f-a478-0f534e3632e0\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.627714 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-utilities\") pod \"21872d04-38b8-449f-a478-0f534e3632e0\" (UID: \"21872d04-38b8-449f-a478-0f534e3632e0\") " Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.628816 4845 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-utilities" (OuterVolumeSpecName: "utilities") pod "21872d04-38b8-449f-a478-0f534e3632e0" (UID: "21872d04-38b8-449f-a478-0f534e3632e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.650412 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21872d04-38b8-449f-a478-0f534e3632e0-kube-api-access-j8b6p" (OuterVolumeSpecName: "kube-api-access-j8b6p") pod "21872d04-38b8-449f-a478-0f534e3632e0" (UID: "21872d04-38b8-449f-a478-0f534e3632e0"). InnerVolumeSpecName "kube-api-access-j8b6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.732015 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8b6p\" (UniqueName: \"kubernetes.io/projected/21872d04-38b8-449f-a478-0f534e3632e0-kube-api-access-j8b6p\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.732050 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.760401 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21872d04-38b8-449f-a478-0f534e3632e0" (UID: "21872d04-38b8-449f-a478-0f534e3632e0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.771700 4845 generic.go:334] "Generic (PLEG): container finished" podID="21872d04-38b8-449f-a478-0f534e3632e0" containerID="b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c" exitCode=0 Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.771791 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkz8" event={"ID":"21872d04-38b8-449f-a478-0f534e3632e0","Type":"ContainerDied","Data":"b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c"} Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.771815 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqkz8" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.771857 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqkz8" event={"ID":"21872d04-38b8-449f-a478-0f534e3632e0","Type":"ContainerDied","Data":"79b771075a004dbc2e6ac2bf2681317d5263184874aaff3482381deacd0ed3a1"} Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.771892 4845 scope.go:117] "RemoveContainer" containerID="b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.820767 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqkz8"] Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.826404 4845 scope.go:117] "RemoveContainer" containerID="9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.832488 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fqkz8"] Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.834040 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/21872d04-38b8-449f-a478-0f534e3632e0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.895087 4845 scope.go:117] "RemoveContainer" containerID="173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.929171 4845 scope.go:117] "RemoveContainer" containerID="b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c" Feb 02 10:57:25 crc kubenswrapper[4845]: E0202 10:57:25.931415 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c\": container with ID starting with b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c not found: ID does not exist" containerID="b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.931527 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c"} err="failed to get container status \"b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c\": rpc error: code = NotFound desc = could not find container \"b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c\": container with ID starting with b567c51caf362bac80b3b2be5b2875445ac9c9e670a723c3644ebd9069cd6d5c not found: ID does not exist" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.931578 4845 scope.go:117] "RemoveContainer" containerID="9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c" Feb 02 10:57:25 crc kubenswrapper[4845]: E0202 10:57:25.932494 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c\": container with ID starting 
with 9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c not found: ID does not exist" containerID="9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.932546 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c"} err="failed to get container status \"9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c\": rpc error: code = NotFound desc = could not find container \"9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c\": container with ID starting with 9096f25597271a6576a0e09c4f4b246d4881d1a75f97e438185515d55a3a157c not found: ID does not exist" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.932580 4845 scope.go:117] "RemoveContainer" containerID="173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55" Feb 02 10:57:25 crc kubenswrapper[4845]: E0202 10:57:25.933255 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55\": container with ID starting with 173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55 not found: ID does not exist" containerID="173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55" Feb 02 10:57:25 crc kubenswrapper[4845]: I0202 10:57:25.933336 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55"} err="failed to get container status \"173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55\": rpc error: code = NotFound desc = could not find container \"173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55\": container with ID starting with 173a2a3b926b1d64073b7d4fd9a488aa0a3d36c047eaaf17d97e9644afaf5d55 not found: ID does 
not exist" Feb 02 10:57:27 crc kubenswrapper[4845]: I0202 10:57:27.725664 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21872d04-38b8-449f-a478-0f534e3632e0" path="/var/lib/kubelet/pods/21872d04-38b8-449f-a478-0f534e3632e0/volumes" Feb 02 10:57:29 crc kubenswrapper[4845]: I0202 10:57:29.721672 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 10:57:29 crc kubenswrapper[4845]: E0202 10:57:29.722322 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 10:57:42 crc kubenswrapper[4845]: I0202 10:57:42.714824 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 10:57:42 crc kubenswrapper[4845]: E0202 10:57:42.718289 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 10:57:53 crc kubenswrapper[4845]: I0202 10:57:53.713621 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 10:57:53 crc kubenswrapper[4845]: E0202 10:57:53.714448 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.473124 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c8c2s"] Feb 02 10:57:56 crc kubenswrapper[4845]: E0202 10:57:56.473959 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="registry-server" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.473973 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="registry-server" Feb 02 10:57:56 crc kubenswrapper[4845]: E0202 10:57:56.473991 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="extract-utilities" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.473998 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="extract-utilities" Feb 02 10:57:56 crc kubenswrapper[4845]: E0202 10:57:56.474031 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="extract-content" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.474038 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="extract-content" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.474275 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="21872d04-38b8-449f-a478-0f534e3632e0" containerName="registry-server" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.476191 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.488721 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c8c2s"] Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.525470 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-catalog-content\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.525587 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzkz\" (UniqueName: \"kubernetes.io/projected/8d22eac5-ebbe-4f96-b316-2f8f285e525d-kube-api-access-nnzkz\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.525741 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-utilities\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.628207 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-utilities\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.628375 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-catalog-content\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.628405 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnzkz\" (UniqueName: \"kubernetes.io/projected/8d22eac5-ebbe-4f96-b316-2f8f285e525d-kube-api-access-nnzkz\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.628752 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-utilities\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.628878 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-catalog-content\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.654452 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzkz\" (UniqueName: \"kubernetes.io/projected/8d22eac5-ebbe-4f96-b316-2f8f285e525d-kube-api-access-nnzkz\") pod \"community-operators-c8c2s\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:56 crc kubenswrapper[4845]: I0202 10:57:56.801686 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:57:57 crc kubenswrapper[4845]: I0202 10:57:57.321653 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c8c2s"] Feb 02 10:57:58 crc kubenswrapper[4845]: I0202 10:57:58.167101 4845 generic.go:334] "Generic (PLEG): container finished" podID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerID="337950f222b414591109cc0dc956079dd3cdadf690a2d7fa356d1e8cbf7a3dfa" exitCode=0 Feb 02 10:57:58 crc kubenswrapper[4845]: I0202 10:57:58.167383 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c2s" event={"ID":"8d22eac5-ebbe-4f96-b316-2f8f285e525d","Type":"ContainerDied","Data":"337950f222b414591109cc0dc956079dd3cdadf690a2d7fa356d1e8cbf7a3dfa"} Feb 02 10:57:58 crc kubenswrapper[4845]: I0202 10:57:58.167706 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c2s" event={"ID":"8d22eac5-ebbe-4f96-b316-2f8f285e525d","Type":"ContainerStarted","Data":"3ae1282b014c1aefd636e6ea9e87b1d088db9865278bfa8c37390fd589a8e356"} Feb 02 10:57:59 crc kubenswrapper[4845]: I0202 10:57:59.192129 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c2s" event={"ID":"8d22eac5-ebbe-4f96-b316-2f8f285e525d","Type":"ContainerStarted","Data":"5cde5d2d272073d7cd8963b7aae94db410880e27625574112c41080117c9fe2a"} Feb 02 10:58:01 crc kubenswrapper[4845]: I0202 10:58:01.220839 4845 generic.go:334] "Generic (PLEG): container finished" podID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerID="5cde5d2d272073d7cd8963b7aae94db410880e27625574112c41080117c9fe2a" exitCode=0 Feb 02 10:58:01 crc kubenswrapper[4845]: I0202 10:58:01.220914 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c2s" 
event={"ID":"8d22eac5-ebbe-4f96-b316-2f8f285e525d","Type":"ContainerDied","Data":"5cde5d2d272073d7cd8963b7aae94db410880e27625574112c41080117c9fe2a"} Feb 02 10:58:02 crc kubenswrapper[4845]: I0202 10:58:02.233601 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c2s" event={"ID":"8d22eac5-ebbe-4f96-b316-2f8f285e525d","Type":"ContainerStarted","Data":"5f2b7bb4369358a8d7e71aa3e4668d1ab3ee39e4a38064bf874407ad021a9394"} Feb 02 10:58:02 crc kubenswrapper[4845]: I0202 10:58:02.266002 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c8c2s" podStartSLOduration=2.807313402 podStartE2EDuration="6.265981146s" podCreationTimestamp="2026-02-02 10:57:56 +0000 UTC" firstStartedPulling="2026-02-02 10:57:58.1699929 +0000 UTC m=+1559.261394350" lastFinishedPulling="2026-02-02 10:58:01.628660644 +0000 UTC m=+1562.720062094" observedRunningTime="2026-02-02 10:58:02.252880804 +0000 UTC m=+1563.344282264" watchObservedRunningTime="2026-02-02 10:58:02.265981146 +0000 UTC m=+1563.357382606" Feb 02 10:58:06 crc kubenswrapper[4845]: I0202 10:58:06.717668 4845 scope.go:117] "RemoveContainer" containerID="13f6f84ab8aba03eaa86df418135b90d82987c32100a7069900fe6528abad51b" Feb 02 10:58:06 crc kubenswrapper[4845]: I0202 10:58:06.754683 4845 scope.go:117] "RemoveContainer" containerID="2fbcebcb43f4c40bf6cc312cfba0d8248bb4757b182770fa199288683be575a0" Feb 02 10:58:06 crc kubenswrapper[4845]: I0202 10:58:06.802794 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:58:06 crc kubenswrapper[4845]: I0202 10:58:06.802916 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:58:06 crc kubenswrapper[4845]: I0202 10:58:06.870343 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:58:06 crc kubenswrapper[4845]: I0202 10:58:06.877995 4845 scope.go:117] "RemoveContainer" containerID="9249fd2f5427ff24d36b33c0b16d83ec165b6227454b663781f59d842989c2da" Feb 02 10:58:07 crc kubenswrapper[4845]: I0202 10:58:07.336907 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:58:07 crc kubenswrapper[4845]: I0202 10:58:07.401277 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c8c2s"] Feb 02 10:58:08 crc kubenswrapper[4845]: I0202 10:58:08.713174 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 10:58:08 crc kubenswrapper[4845]: E0202 10:58:08.713797 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 10:58:09 crc kubenswrapper[4845]: I0202 10:58:09.310296 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c8c2s" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerName="registry-server" containerID="cri-o://5f2b7bb4369358a8d7e71aa3e4668d1ab3ee39e4a38064bf874407ad021a9394" gracePeriod=2 Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.337637 4845 generic.go:334] "Generic (PLEG): container finished" podID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerID="5f2b7bb4369358a8d7e71aa3e4668d1ab3ee39e4a38064bf874407ad021a9394" exitCode=0 Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.337701 4845 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-c8c2s" event={"ID":"8d22eac5-ebbe-4f96-b316-2f8f285e525d","Type":"ContainerDied","Data":"5f2b7bb4369358a8d7e71aa3e4668d1ab3ee39e4a38064bf874407ad021a9394"} Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.531413 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8c2s" Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.724215 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnzkz\" (UniqueName: \"kubernetes.io/projected/8d22eac5-ebbe-4f96-b316-2f8f285e525d-kube-api-access-nnzkz\") pod \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.724780 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-utilities\") pod \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.725208 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-catalog-content\") pod \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\" (UID: \"8d22eac5-ebbe-4f96-b316-2f8f285e525d\") " Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.725867 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-utilities" (OuterVolumeSpecName: "utilities") pod "8d22eac5-ebbe-4f96-b316-2f8f285e525d" (UID: "8d22eac5-ebbe-4f96-b316-2f8f285e525d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.729130 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.734383 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d22eac5-ebbe-4f96-b316-2f8f285e525d-kube-api-access-nnzkz" (OuterVolumeSpecName: "kube-api-access-nnzkz") pod "8d22eac5-ebbe-4f96-b316-2f8f285e525d" (UID: "8d22eac5-ebbe-4f96-b316-2f8f285e525d"). InnerVolumeSpecName "kube-api-access-nnzkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.787305 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d22eac5-ebbe-4f96-b316-2f8f285e525d" (UID: "8d22eac5-ebbe-4f96-b316-2f8f285e525d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.832985 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d22eac5-ebbe-4f96-b316-2f8f285e525d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 10:58:10 crc kubenswrapper[4845]: I0202 10:58:10.833024 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnzkz\" (UniqueName: \"kubernetes.io/projected/8d22eac5-ebbe-4f96-b316-2f8f285e525d-kube-api-access-nnzkz\") on node \"crc\" DevicePath \"\""
Feb 02 10:58:11 crc kubenswrapper[4845]: I0202 10:58:11.354823 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c8c2s" event={"ID":"8d22eac5-ebbe-4f96-b316-2f8f285e525d","Type":"ContainerDied","Data":"3ae1282b014c1aefd636e6ea9e87b1d088db9865278bfa8c37390fd589a8e356"}
Feb 02 10:58:11 crc kubenswrapper[4845]: I0202 10:58:11.354968 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c8c2s"
Feb 02 10:58:11 crc kubenswrapper[4845]: I0202 10:58:11.355227 4845 scope.go:117] "RemoveContainer" containerID="5f2b7bb4369358a8d7e71aa3e4668d1ab3ee39e4a38064bf874407ad021a9394"
Feb 02 10:58:11 crc kubenswrapper[4845]: I0202 10:58:11.391773 4845 scope.go:117] "RemoveContainer" containerID="5cde5d2d272073d7cd8963b7aae94db410880e27625574112c41080117c9fe2a"
Feb 02 10:58:11 crc kubenswrapper[4845]: I0202 10:58:11.407913 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c8c2s"]
Feb 02 10:58:11 crc kubenswrapper[4845]: I0202 10:58:11.423983 4845 scope.go:117] "RemoveContainer" containerID="337950f222b414591109cc0dc956079dd3cdadf690a2d7fa356d1e8cbf7a3dfa"
Feb 02 10:58:11 crc kubenswrapper[4845]: I0202 10:58:11.426521 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c8c2s"]
Feb 02 10:58:11 crc kubenswrapper[4845]: I0202 10:58:11.726218 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" path="/var/lib/kubelet/pods/8d22eac5-ebbe-4f96-b316-2f8f285e525d/volumes"
Feb 02 10:58:20 crc kubenswrapper[4845]: I0202 10:58:20.712318 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:58:20 crc kubenswrapper[4845]: E0202 10:58:20.713361 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:58:31 crc kubenswrapper[4845]: I0202 10:58:31.712762 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:58:31 crc kubenswrapper[4845]: E0202 10:58:31.713664 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:58:45 crc kubenswrapper[4845]: I0202 10:58:45.714335 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:58:45 crc kubenswrapper[4845]: E0202 10:58:45.715955 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:58:56 crc kubenswrapper[4845]: I0202 10:58:56.712815 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:58:56 crc kubenswrapper[4845]: E0202 10:58:56.713958 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:59:06 crc kubenswrapper[4845]: I0202 10:59:06.992479 4845 scope.go:117] "RemoveContainer" containerID="cc9a388001d07f3511088bc5b867a073370c23dfc566a76ed6b25957f4cd9611"
Feb 02 10:59:07 crc kubenswrapper[4845]: I0202 10:59:07.045078 4845 scope.go:117] "RemoveContainer" containerID="423aeacd820ec5c2d675794591067804e90d1f2a6923ef3a9f13012b659813bc"
Feb 02 10:59:07 crc kubenswrapper[4845]: I0202 10:59:07.092097 4845 scope.go:117] "RemoveContainer" containerID="1e24bbe2d8cd0583fc986cf6fd412f527daea58b4a85706eb389322bf6ad3af7"
Feb 02 10:59:07 crc kubenswrapper[4845]: I0202 10:59:07.119485 4845 scope.go:117] "RemoveContainer" containerID="4c795b185512cbaa08a41087a290ab376088c31c6277e1a8f0ee3f21dc22200f"
Feb 02 10:59:07 crc kubenswrapper[4845]: I0202 10:59:07.144907 4845 scope.go:117] "RemoveContainer" containerID="5c7e1e0f5ba6836be4b2cc0a23514474d1930ec71f3bf7b3e6b27bbccac7ee40"
Feb 02 10:59:07 crc kubenswrapper[4845]: I0202 10:59:07.714745 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:59:07 crc kubenswrapper[4845]: E0202 10:59:07.715307 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:59:19 crc kubenswrapper[4845]: I0202 10:59:19.724029 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:59:19 crc kubenswrapper[4845]: E0202 10:59:19.724978 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:59:31 crc kubenswrapper[4845]: I0202 10:59:31.714417 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:59:31 crc kubenswrapper[4845]: E0202 10:59:31.715592 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:59:44 crc kubenswrapper[4845]: I0202 10:59:44.713413 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:59:44 crc kubenswrapper[4845]: E0202 10:59:44.714336 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 10:59:55 crc kubenswrapper[4845]: I0202 10:59:55.713355 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 10:59:55 crc kubenswrapper[4845]: E0202 10:59:55.714227 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.157869 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"]
Feb 02 11:00:00 crc kubenswrapper[4845]: E0202 11:00:00.159123 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerName="extract-content"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.159151 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerName="extract-content"
Feb 02 11:00:00 crc kubenswrapper[4845]: E0202 11:00:00.159188 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerName="registry-server"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.159196 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerName="registry-server"
Feb 02 11:00:00 crc kubenswrapper[4845]: E0202 11:00:00.159223 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerName="extract-utilities"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.159232 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerName="extract-utilities"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.159528 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d22eac5-ebbe-4f96-b316-2f8f285e525d" containerName="registry-server"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.160651 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.162866 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.163345 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.176169 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"]
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.189349 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b535e9d-4510-4191-9ab5-768d449b7bc3-config-volume\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.189479 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/6b535e9d-4510-4191-9ab5-768d449b7bc3-kube-api-access-7szmg\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.189619 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b535e9d-4510-4191-9ab5-768d449b7bc3-secret-volume\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.292391 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b535e9d-4510-4191-9ab5-768d449b7bc3-config-volume\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.292475 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/6b535e9d-4510-4191-9ab5-768d449b7bc3-kube-api-access-7szmg\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.292573 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b535e9d-4510-4191-9ab5-768d449b7bc3-secret-volume\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.293783 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b535e9d-4510-4191-9ab5-768d449b7bc3-config-volume\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.308168 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b535e9d-4510-4191-9ab5-768d449b7bc3-secret-volume\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.309741 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/6b535e9d-4510-4191-9ab5-768d449b7bc3-kube-api-access-7szmg\") pod \"collect-profiles-29500500-g5gxb\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.480895 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:00 crc kubenswrapper[4845]: I0202 11:00:00.989264 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"]
Feb 02 11:00:01 crc kubenswrapper[4845]: I0202 11:00:01.690276 4845 generic.go:334] "Generic (PLEG): container finished" podID="6b535e9d-4510-4191-9ab5-768d449b7bc3" containerID="e8b882d2679f84fe117d5aa26326dd7526ca6a2f74a7144d1db5d6a93a707b5a" exitCode=0
Feb 02 11:00:01 crc kubenswrapper[4845]: I0202 11:00:01.690464 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb" event={"ID":"6b535e9d-4510-4191-9ab5-768d449b7bc3","Type":"ContainerDied","Data":"e8b882d2679f84fe117d5aa26326dd7526ca6a2f74a7144d1db5d6a93a707b5a"}
Feb 02 11:00:01 crc kubenswrapper[4845]: I0202 11:00:01.690768 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb" event={"ID":"6b535e9d-4510-4191-9ab5-768d449b7bc3","Type":"ContainerStarted","Data":"bab954d8ae8de5cb4daf181036b811bdc367d81b9d5a806b6002f216de90b8f0"}
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.115082 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.262452 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/6b535e9d-4510-4191-9ab5-768d449b7bc3-kube-api-access-7szmg\") pod \"6b535e9d-4510-4191-9ab5-768d449b7bc3\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") "
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.262565 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b535e9d-4510-4191-9ab5-768d449b7bc3-secret-volume\") pod \"6b535e9d-4510-4191-9ab5-768d449b7bc3\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") "
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.262663 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b535e9d-4510-4191-9ab5-768d449b7bc3-config-volume\") pod \"6b535e9d-4510-4191-9ab5-768d449b7bc3\" (UID: \"6b535e9d-4510-4191-9ab5-768d449b7bc3\") "
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.265319 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b535e9d-4510-4191-9ab5-768d449b7bc3-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b535e9d-4510-4191-9ab5-768d449b7bc3" (UID: "6b535e9d-4510-4191-9ab5-768d449b7bc3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.271057 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b535e9d-4510-4191-9ab5-768d449b7bc3-kube-api-access-7szmg" (OuterVolumeSpecName: "kube-api-access-7szmg") pod "6b535e9d-4510-4191-9ab5-768d449b7bc3" (UID: "6b535e9d-4510-4191-9ab5-768d449b7bc3"). InnerVolumeSpecName "kube-api-access-7szmg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.271333 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b535e9d-4510-4191-9ab5-768d449b7bc3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b535e9d-4510-4191-9ab5-768d449b7bc3" (UID: "6b535e9d-4510-4191-9ab5-768d449b7bc3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.365958 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7szmg\" (UniqueName: \"kubernetes.io/projected/6b535e9d-4510-4191-9ab5-768d449b7bc3-kube-api-access-7szmg\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.365994 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b535e9d-4510-4191-9ab5-768d449b7bc3-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.366007 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b535e9d-4510-4191-9ab5-768d449b7bc3-config-volume\") on node \"crc\" DevicePath \"\""
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.713696 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.725342 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb" event={"ID":"6b535e9d-4510-4191-9ab5-768d449b7bc3","Type":"ContainerDied","Data":"bab954d8ae8de5cb4daf181036b811bdc367d81b9d5a806b6002f216de90b8f0"}
Feb 02 11:00:03 crc kubenswrapper[4845]: I0202 11:00:03.725389 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bab954d8ae8de5cb4daf181036b811bdc367d81b9d5a806b6002f216de90b8f0"
Feb 02 11:00:07 crc kubenswrapper[4845]: I0202 11:00:07.247748 4845 scope.go:117] "RemoveContainer" containerID="9eb0cc21db22e7b0a6e9194e496c67f804362485d557f665096269f5e604e637"
Feb 02 11:00:07 crc kubenswrapper[4845]: I0202 11:00:07.279022 4845 scope.go:117] "RemoveContainer" containerID="7f93106b71edc6fc8f88297c4c620682fa2e4fe0213e9e1cad77a132eee7f48a"
Feb 02 11:00:07 crc kubenswrapper[4845]: I0202 11:00:07.300698 4845 scope.go:117] "RemoveContainer" containerID="65f19a5e53861afad5be0dc22811e405a7089640d2466a967d0205688cbae9c3"
Feb 02 11:00:07 crc kubenswrapper[4845]: I0202 11:00:07.326859 4845 scope.go:117] "RemoveContainer" containerID="1e8465847af87f5185fe9a371926a2dffc326ca101ce721ffdfe10ea1d45b3b5"
Feb 02 11:00:10 crc kubenswrapper[4845]: I0202 11:00:10.714160 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 11:00:10 crc kubenswrapper[4845]: E0202 11:00:10.715082 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:00:21 crc kubenswrapper[4845]: I0202 11:00:21.713821 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 11:00:21 crc kubenswrapper[4845]: E0202 11:00:21.714581 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:00:36 crc kubenswrapper[4845]: I0202 11:00:36.712720 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 11:00:36 crc kubenswrapper[4845]: E0202 11:00:36.713870 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:00:48 crc kubenswrapper[4845]: I0202 11:00:48.713657 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 11:00:48 crc kubenswrapper[4845]: E0202 11:00:48.714567 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.171035 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29500501-znk7q"]
Feb 02 11:01:00 crc kubenswrapper[4845]: E0202 11:01:00.179644 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b535e9d-4510-4191-9ab5-768d449b7bc3" containerName="collect-profiles"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.180009 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b535e9d-4510-4191-9ab5-768d449b7bc3" containerName="collect-profiles"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.180433 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b535e9d-4510-4191-9ab5-768d449b7bc3" containerName="collect-profiles"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.181552 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.210135 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500501-znk7q"]
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.289317 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqkts\" (UniqueName: \"kubernetes.io/projected/7303667b-89bb-4ad1-92a8-3c94525911d4-kube-api-access-rqkts\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.289603 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-combined-ca-bundle\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.289666 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-config-data\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.289769 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-fernet-keys\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.392710 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-config-data\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.392858 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-fernet-keys\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.392946 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqkts\" (UniqueName: \"kubernetes.io/projected/7303667b-89bb-4ad1-92a8-3c94525911d4-kube-api-access-rqkts\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.393037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-combined-ca-bundle\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.399712 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-fernet-keys\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.404496 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-config-data\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.404852 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-combined-ca-bundle\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.411760 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqkts\" (UniqueName: \"kubernetes.io/projected/7303667b-89bb-4ad1-92a8-3c94525911d4-kube-api-access-rqkts\") pod \"keystone-cron-29500501-znk7q\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") " pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:00 crc kubenswrapper[4845]: I0202 11:01:00.507076 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:01 crc kubenswrapper[4845]: I0202 11:01:01.017401 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500501-znk7q"]
Feb 02 11:01:01 crc kubenswrapper[4845]: W0202 11:01:01.023386 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7303667b_89bb_4ad1_92a8_3c94525911d4.slice/crio-9cb43210ee4d612528da59913bdb2ba664514714e9e2648d77d865df0d94f626 WatchSource:0}: Error finding container 9cb43210ee4d612528da59913bdb2ba664514714e9e2648d77d865df0d94f626: Status 404 returned error can't find the container with id 9cb43210ee4d612528da59913bdb2ba664514714e9e2648d77d865df0d94f626
Feb 02 11:01:01 crc kubenswrapper[4845]: I0202 11:01:01.413864 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500501-znk7q" event={"ID":"7303667b-89bb-4ad1-92a8-3c94525911d4","Type":"ContainerStarted","Data":"78e2917d9a9e1517466ea6ff81ddaca6478a9b6469927ef2ef8488e6ebc57d42"}
Feb 02 11:01:01 crc kubenswrapper[4845]: I0202 11:01:01.414897 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500501-znk7q" event={"ID":"7303667b-89bb-4ad1-92a8-3c94525911d4","Type":"ContainerStarted","Data":"9cb43210ee4d612528da59913bdb2ba664514714e9e2648d77d865df0d94f626"}
Feb 02 11:01:01 crc kubenswrapper[4845]: I0202 11:01:01.443906 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29500501-znk7q" podStartSLOduration=1.443869482 podStartE2EDuration="1.443869482s" podCreationTimestamp="2026-02-02 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:01:01.434127462 +0000 UTC m=+1742.525528922" watchObservedRunningTime="2026-02-02 11:01:01.443869482 +0000 UTC m=+1742.535270932"
Feb 02 11:01:01 crc kubenswrapper[4845]: I0202 11:01:01.713562 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020"
Feb 02 11:01:01 crc kubenswrapper[4845]: E0202 11:01:01.714247 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:01:05 crc kubenswrapper[4845]: I0202 11:01:05.456926 4845 generic.go:334] "Generic (PLEG): container finished" podID="7303667b-89bb-4ad1-92a8-3c94525911d4" containerID="78e2917d9a9e1517466ea6ff81ddaca6478a9b6469927ef2ef8488e6ebc57d42" exitCode=0
Feb 02 11:01:05 crc kubenswrapper[4845]: I0202 11:01:05.457046 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500501-znk7q" event={"ID":"7303667b-89bb-4ad1-92a8-3c94525911d4","Type":"ContainerDied","Data":"78e2917d9a9e1517466ea6ff81ddaca6478a9b6469927ef2ef8488e6ebc57d42"}
Feb 02 11:01:06 crc kubenswrapper[4845]: I0202 11:01:06.901826 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500501-znk7q"
Feb 02 11:01:06 crc kubenswrapper[4845]: I0202 11:01:06.980528 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-config-data\") pod \"7303667b-89bb-4ad1-92a8-3c94525911d4\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") "
Feb 02 11:01:06 crc kubenswrapper[4845]: I0202 11:01:06.980822 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-combined-ca-bundle\") pod \"7303667b-89bb-4ad1-92a8-3c94525911d4\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") "
Feb 02 11:01:06 crc kubenswrapper[4845]: I0202 11:01:06.980988 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqkts\" (UniqueName: \"kubernetes.io/projected/7303667b-89bb-4ad1-92a8-3c94525911d4-kube-api-access-rqkts\") pod \"7303667b-89bb-4ad1-92a8-3c94525911d4\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") "
Feb 02 11:01:06 crc kubenswrapper[4845]: I0202 11:01:06.981168 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-fernet-keys\") pod \"7303667b-89bb-4ad1-92a8-3c94525911d4\" (UID: \"7303667b-89bb-4ad1-92a8-3c94525911d4\") "
Feb 02 11:01:06 crc kubenswrapper[4845]: I0202 11:01:06.988874 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7303667b-89bb-4ad1-92a8-3c94525911d4" (UID: "7303667b-89bb-4ad1-92a8-3c94525911d4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:01:06 crc kubenswrapper[4845]: I0202 11:01:06.989922 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7303667b-89bb-4ad1-92a8-3c94525911d4-kube-api-access-rqkts" (OuterVolumeSpecName: "kube-api-access-rqkts") pod "7303667b-89bb-4ad1-92a8-3c94525911d4" (UID: "7303667b-89bb-4ad1-92a8-3c94525911d4"). InnerVolumeSpecName "kube-api-access-rqkts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.019816 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7303667b-89bb-4ad1-92a8-3c94525911d4" (UID: "7303667b-89bb-4ad1-92a8-3c94525911d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.050286 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-config-data" (OuterVolumeSpecName: "config-data") pod "7303667b-89bb-4ad1-92a8-3c94525911d4" (UID: "7303667b-89bb-4ad1-92a8-3c94525911d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.085279 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.085325 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.085344 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqkts\" (UniqueName: \"kubernetes.io/projected/7303667b-89bb-4ad1-92a8-3c94525911d4-kube-api-access-rqkts\") on node \"crc\" DevicePath \"\""
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.085357 4845 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7303667b-89bb-4ad1-92a8-3c94525911d4-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.444323 4845 scope.go:117] "RemoveContainer" containerID="f97ac7a8db772c50e60d2fbaebaa59d3f1748dd362df2a5683272ada45ea3c75"
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.471897 4845 scope.go:117] "RemoveContainer" containerID="3ebbd0d0a7130e66b17b81004ea1929c3e2d40fef4c6767da3df2300291382c2"
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.506493 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500501-znk7q" event={"ID":"7303667b-89bb-4ad1-92a8-3c94525911d4","Type":"ContainerDied","Data":"9cb43210ee4d612528da59913bdb2ba664514714e9e2648d77d865df0d94f626"}
Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.506543 4845 pod_container_deletor.go:80] "Container not found in pod's containers"
containerID="9cb43210ee4d612528da59913bdb2ba664514714e9e2648d77d865df0d94f626" Feb 02 11:01:07 crc kubenswrapper[4845]: I0202 11:01:07.506602 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500501-znk7q" Feb 02 11:01:13 crc kubenswrapper[4845]: I0202 11:01:13.712837 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 11:01:13 crc kubenswrapper[4845]: E0202 11:01:13.713696 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:01:28 crc kubenswrapper[4845]: I0202 11:01:28.714947 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 11:01:28 crc kubenswrapper[4845]: E0202 11:01:28.716226 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:01:32 crc kubenswrapper[4845]: I0202 11:01:32.070097 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6d20-account-create-update-8zrt2"] Feb 02 11:01:32 crc kubenswrapper[4845]: I0202 11:01:32.082281 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-q8crj"] Feb 02 11:01:32 crc kubenswrapper[4845]: I0202 
11:01:32.096131 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6d20-account-create-update-8zrt2"] Feb 02 11:01:32 crc kubenswrapper[4845]: I0202 11:01:32.107498 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-q8crj"] Feb 02 11:01:33 crc kubenswrapper[4845]: I0202 11:01:33.037310 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-hchq8"] Feb 02 11:01:33 crc kubenswrapper[4845]: I0202 11:01:33.064709 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-42da-account-create-update-dmqrb"] Feb 02 11:01:33 crc kubenswrapper[4845]: I0202 11:01:33.090738 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-42da-account-create-update-dmqrb"] Feb 02 11:01:33 crc kubenswrapper[4845]: I0202 11:01:33.102986 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-hchq8"] Feb 02 11:01:33 crc kubenswrapper[4845]: I0202 11:01:33.725436 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8" path="/var/lib/kubelet/pods/05069a45-f3d6-43e9-bf29-2e3a3cbcc2d8/volumes" Feb 02 11:01:33 crc kubenswrapper[4845]: I0202 11:01:33.726193 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f4db3a3-fdab-41f0-b675-26aaaa575769" path="/var/lib/kubelet/pods/1f4db3a3-fdab-41f0-b675-26aaaa575769/volumes" Feb 02 11:01:33 crc kubenswrapper[4845]: I0202 11:01:33.726853 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="802ba94f-17f1-4eed-93aa-95e5ffe1ea43" path="/var/lib/kubelet/pods/802ba94f-17f1-4eed-93aa-95e5ffe1ea43/volumes" Feb 02 11:01:33 crc kubenswrapper[4845]: I0202 11:01:33.727590 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e" path="/var/lib/kubelet/pods/82c7eb05-f8ef-40a5-b799-af8bfdfd9c4e/volumes" Feb 02 11:01:34 crc 
kubenswrapper[4845]: I0202 11:01:34.039303 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-2fba-account-create-update-57wqb"] Feb 02 11:01:34 crc kubenswrapper[4845]: I0202 11:01:34.051147 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-2fba-account-create-update-57wqb"] Feb 02 11:01:35 crc kubenswrapper[4845]: I0202 11:01:35.029459 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-wlplx"] Feb 02 11:01:35 crc kubenswrapper[4845]: I0202 11:01:35.043364 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-wlplx"] Feb 02 11:01:35 crc kubenswrapper[4845]: I0202 11:01:35.728890 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc" path="/var/lib/kubelet/pods/8d4f7cb3-0991-4ce6-a69d-fd6f17bbc2fc/volumes" Feb 02 11:01:35 crc kubenswrapper[4845]: I0202 11:01:35.729581 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc354af6-cf06-4532-83c7-845e6f8f41c5" path="/var/lib/kubelet/pods/fc354af6-cf06-4532-83c7-845e6f8f41c5/volumes" Feb 02 11:01:39 crc kubenswrapper[4845]: I0202 11:01:39.036984 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4gj6v"] Feb 02 11:01:39 crc kubenswrapper[4845]: I0202 11:01:39.056067 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1e2c-account-create-update-jxgpc"] Feb 02 11:01:39 crc kubenswrapper[4845]: I0202 11:01:39.069613 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1e2c-account-create-update-jxgpc"] Feb 02 11:01:39 crc kubenswrapper[4845]: I0202 11:01:39.083990 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4gj6v"] Feb 02 11:01:39 crc kubenswrapper[4845]: I0202 11:01:39.723704 4845 scope.go:117] "RemoveContainer" 
containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 11:01:39 crc kubenswrapper[4845]: E0202 11:01:39.724285 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:01:39 crc kubenswrapper[4845]: I0202 11:01:39.724871 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13911fd9-043e-424e-ba84-da6af616a202" path="/var/lib/kubelet/pods/13911fd9-043e-424e-ba84-da6af616a202/volumes" Feb 02 11:01:39 crc kubenswrapper[4845]: I0202 11:01:39.725667 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa851884-d67b-4c70-8ad6-9dcf92001aa5" path="/var/lib/kubelet/pods/aa851884-d67b-4c70-8ad6-9dcf92001aa5/volumes" Feb 02 11:01:48 crc kubenswrapper[4845]: I0202 11:01:48.049227 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-c783-account-create-update-c8k62"] Feb 02 11:01:48 crc kubenswrapper[4845]: I0202 11:01:48.064805 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-z27qx"] Feb 02 11:01:48 crc kubenswrapper[4845]: I0202 11:01:48.075464 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-c783-account-create-update-c8k62"] Feb 02 11:01:48 crc kubenswrapper[4845]: I0202 11:01:48.086547 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-z27qx"] Feb 02 11:01:49 crc kubenswrapper[4845]: I0202 11:01:49.726542 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="712e6155-a77e-4f9c-9d55-a6edab62e9a7" 
path="/var/lib/kubelet/pods/712e6155-a77e-4f9c-9d55-a6edab62e9a7/volumes" Feb 02 11:01:49 crc kubenswrapper[4845]: I0202 11:01:49.727840 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0fdfb88-9683-4cc2-95f1-6ab55c558dfd" path="/var/lib/kubelet/pods/e0fdfb88-9683-4cc2-95f1-6ab55c558dfd/volumes" Feb 02 11:01:50 crc kubenswrapper[4845]: I0202 11:01:50.712941 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 11:01:50 crc kubenswrapper[4845]: E0202 11:01:50.713327 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:02:03 crc kubenswrapper[4845]: I0202 11:02:03.712792 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 11:02:03 crc kubenswrapper[4845]: E0202 11:02:03.713585 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:02:05 crc kubenswrapper[4845]: I0202 11:02:05.039043 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qqb26"] Feb 02 11:02:05 crc kubenswrapper[4845]: I0202 11:02:05.055929 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-qqb26"] Feb 02 11:02:05 crc kubenswrapper[4845]: I0202 11:02:05.726454 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62b9529c-8c20-47e9-8c19-910a31b30683" path="/var/lib/kubelet/pods/62b9529c-8c20-47e9-8c19-910a31b30683/volumes" Feb 02 11:02:07 crc kubenswrapper[4845]: I0202 11:02:07.549510 4845 scope.go:117] "RemoveContainer" containerID="2b3f2d6cbc2fbaafd3e7acd158c2d8862a6fa7d667477c7485ce11aa580584b6" Feb 02 11:02:07 crc kubenswrapper[4845]: I0202 11:02:07.586667 4845 scope.go:117] "RemoveContainer" containerID="e2da395f32221226555ec9a36b4c70b9ebd84972cf5fd496af49cd196e7172a2" Feb 02 11:02:07 crc kubenswrapper[4845]: I0202 11:02:07.640339 4845 scope.go:117] "RemoveContainer" containerID="7371036457c908a60d98a53f34d54f0d70618efbebcf97fbc3e3f1b041ae7110" Feb 02 11:02:07 crc kubenswrapper[4845]: I0202 11:02:07.704052 4845 scope.go:117] "RemoveContainer" containerID="54419ef52d04b18470afc9cd9fe1e7776568928b396efb56b1f79342767e7b05" Feb 02 11:02:07 crc kubenswrapper[4845]: I0202 11:02:07.758521 4845 scope.go:117] "RemoveContainer" containerID="8f242d53b8b45b41f78ed81a7dfba88bead6b72ed12835b100468efa95d3864d" Feb 02 11:02:07 crc kubenswrapper[4845]: I0202 11:02:07.844292 4845 scope.go:117] "RemoveContainer" containerID="a6fc1e9766e80a7882547aca33a905323116744093fc66e9ca25989843c77b7a" Feb 02 11:02:07 crc kubenswrapper[4845]: I0202 11:02:07.900116 4845 scope.go:117] "RemoveContainer" containerID="a93a594f3ef1f5fe22327995e92493ed98d1dba81262b5bcf617c4c84d0e3aba" Feb 02 11:02:07 crc kubenswrapper[4845]: I0202 11:02:07.952581 4845 scope.go:117] "RemoveContainer" containerID="f34968fe05a8bf94342bdc3d85ae1b9aa88e7cf9dc5bd5dd49c6ff1a1947185f" Feb 02 11:02:08 crc kubenswrapper[4845]: I0202 11:02:08.009843 4845 scope.go:117] "RemoveContainer" containerID="2206de97b65437900f5876967e6088126d1863f30884da3104dd45175b5b4a13" Feb 02 11:02:08 crc kubenswrapper[4845]: I0202 11:02:08.034047 4845 scope.go:117] 
"RemoveContainer" containerID="4440ff2b707ea7d06429dcadf4f2755be4ff5fe9e3d35fbb9b3d5449440ebcdb" Feb 02 11:02:08 crc kubenswrapper[4845]: I0202 11:02:08.067116 4845 scope.go:117] "RemoveContainer" containerID="cf759ca4ae8492477c32d3ee2af20b6a746da7ed09b5646264fc1f94d46044d6" Feb 02 11:02:08 crc kubenswrapper[4845]: I0202 11:02:08.098830 4845 scope.go:117] "RemoveContainer" containerID="3459391ca6853a62f9d027449b7b8a2cf779254301205a614a5d854038e87879" Feb 02 11:02:08 crc kubenswrapper[4845]: I0202 11:02:08.128108 4845 scope.go:117] "RemoveContainer" containerID="c7c52a689d78c280fc8b76cead8491d0d675cc8f51ac92d599f8ca929b26e719" Feb 02 11:02:11 crc kubenswrapper[4845]: I0202 11:02:11.054424 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-8ggwt"] Feb 02 11:02:11 crc kubenswrapper[4845]: I0202 11:02:11.069704 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-8ggwt"] Feb 02 11:02:11 crc kubenswrapper[4845]: I0202 11:02:11.730496 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367466e2-34f1-4f2c-9e11-eb6c24c5318c" path="/var/lib/kubelet/pods/367466e2-34f1-4f2c-9e11-eb6c24c5318c/volumes" Feb 02 11:02:12 crc kubenswrapper[4845]: I0202 11:02:12.034299 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-wnlhd"] Feb 02 11:02:12 crc kubenswrapper[4845]: I0202 11:02:12.047708 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h6qld"] Feb 02 11:02:12 crc kubenswrapper[4845]: I0202 11:02:12.059544 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-wnlhd"] Feb 02 11:02:12 crc kubenswrapper[4845]: I0202 11:02:12.075054 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h6qld"] Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.038497 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-phm7s"] Feb 02 
11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.061808 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ee84-account-create-update-f2n87"] Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.072635 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ee84-account-create-update-f2n87"] Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.083381 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3a8c-account-create-update-qmrlh"] Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.093628 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-phm7s"] Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.103587 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3a8c-account-create-update-qmrlh"] Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.113922 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-edbb-account-create-update-gt7ll"] Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.124157 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-edbb-account-create-update-gt7ll"] Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.725901 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf66acf-0a94-4850-913b-711b19b88dd3" path="/var/lib/kubelet/pods/2cf66acf-0a94-4850-913b-711b19b88dd3/volumes" Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.726630 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e0fd8e-0f85-48be-b690-c11e3c09f340" path="/var/lib/kubelet/pods/37e0fd8e-0f85-48be-b690-c11e3c09f340/volumes" Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.727245 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e" path="/var/lib/kubelet/pods/45ea84b0-b5f2-4a74-8f6a-67b4176e5d1e/volumes" Feb 02 11:02:13 crc kubenswrapper[4845]: 
I0202 11:02:13.727820 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad1fe923-0409-4c3c-869c-9d0c09a2506a" path="/var/lib/kubelet/pods/ad1fe923-0409-4c3c-869c-9d0c09a2506a/volumes" Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.728867 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afec66f7-184b-44f1-a172-b1e78739309d" path="/var/lib/kubelet/pods/afec66f7-184b-44f1-a172-b1e78739309d/volumes" Feb 02 11:02:13 crc kubenswrapper[4845]: I0202 11:02:13.729483 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb890e0-ca91-4204-8e4b-9036a64e56e1" path="/var/lib/kubelet/pods/efb890e0-ca91-4204-8e4b-9036a64e56e1/volumes" Feb 02 11:02:14 crc kubenswrapper[4845]: I0202 11:02:14.712690 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 11:02:14 crc kubenswrapper[4845]: E0202 11:02:14.713037 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:02:17 crc kubenswrapper[4845]: I0202 11:02:17.034170 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-cba3-account-create-update-bph8b"] Feb 02 11:02:17 crc kubenswrapper[4845]: I0202 11:02:17.045756 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-cba3-account-create-update-bph8b"] Feb 02 11:02:17 crc kubenswrapper[4845]: I0202 11:02:17.726274 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af47917-824a-452b-b0db-03ad3f4861df" path="/var/lib/kubelet/pods/9af47917-824a-452b-b0db-03ad3f4861df/volumes" Feb 02 11:02:18 
crc kubenswrapper[4845]: I0202 11:02:18.058267 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-kgn95"] Feb 02 11:02:18 crc kubenswrapper[4845]: I0202 11:02:18.073249 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-kgn95"] Feb 02 11:02:19 crc kubenswrapper[4845]: I0202 11:02:19.726046 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34877df4-b654-4e0c-ac67-da6fd95c249d" path="/var/lib/kubelet/pods/34877df4-b654-4e0c-ac67-da6fd95c249d/volumes" Feb 02 11:02:24 crc kubenswrapper[4845]: I0202 11:02:24.047779 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jbstq"] Feb 02 11:02:24 crc kubenswrapper[4845]: I0202 11:02:24.064945 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jbstq"] Feb 02 11:02:25 crc kubenswrapper[4845]: I0202 11:02:25.743249 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="967b449a-1414-4a5c-b625-bcaf12b17ade" path="/var/lib/kubelet/pods/967b449a-1414-4a5c-b625-bcaf12b17ade/volumes" Feb 02 11:02:29 crc kubenswrapper[4845]: I0202 11:02:29.723825 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 11:02:30 crc kubenswrapper[4845]: I0202 11:02:30.698812 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"0b390714f50dcbc5e75002b4a0473fa9805c797be6ccee1c185ac005a97bc29e"} Feb 02 11:02:56 crc kubenswrapper[4845]: I0202 11:02:56.056247 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-cpdt4"] Feb 02 11:02:56 crc kubenswrapper[4845]: I0202 11:02:56.073145 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-cpdt4"] Feb 02 11:02:57 crc kubenswrapper[4845]: I0202 
11:02:57.729456 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2" path="/var/lib/kubelet/pods/856e9eb8-5bdb-40d0-9ad7-dec8d2594fc2/volumes" Feb 02 11:03:07 crc kubenswrapper[4845]: I0202 11:03:07.043743 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-gxjbc"] Feb 02 11:03:07 crc kubenswrapper[4845]: I0202 11:03:07.061559 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-f7js4"] Feb 02 11:03:07 crc kubenswrapper[4845]: I0202 11:03:07.078204 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-f7js4"] Feb 02 11:03:07 crc kubenswrapper[4845]: I0202 11:03:07.093041 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-gxjbc"] Feb 02 11:03:07 crc kubenswrapper[4845]: I0202 11:03:07.728252 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b" path="/var/lib/kubelet/pods/1cb74cc1-ae6a-46c7-8ce6-ba4d353ce47b/volumes" Feb 02 11:03:07 crc kubenswrapper[4845]: I0202 11:03:07.729520 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae3aa591-f1f0-4264-a970-d8172cc24781" path="/var/lib/kubelet/pods/ae3aa591-f1f0-4264-a970-d8172cc24781/volumes" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.037286 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-kxrm5"] Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.050541 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-kxrm5"] Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.400080 4845 scope.go:117] "RemoveContainer" containerID="c026c97f3f623cb46f500e205081203362ba8f0b275d0368c0ea74ac7d34d244" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.439648 4845 scope.go:117] "RemoveContainer" 
containerID="eaea8c79134a763eb67941fd983f9ce0e269d99bbc22b44421da606e30805a94" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.531720 4845 scope.go:117] "RemoveContainer" containerID="7d589ab274ae62a36101e94b25da0bcc5210eda8997714fcc496cd1866ddd622" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.583421 4845 scope.go:117] "RemoveContainer" containerID="5c1dc639a0ba7e9ddef0cb628d1e688f84eee0dbcad460df6e423cdfb04749bd" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.665306 4845 scope.go:117] "RemoveContainer" containerID="3a7ec3be2e02a83d1849c451b822789235edc5a4672079139f59894d6d036a70" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.747753 4845 scope.go:117] "RemoveContainer" containerID="a8fcb11c46488e7ab4d44f9b73e21b0ab99aab04b639ff39da8c3dcc0a64fd01" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.800067 4845 scope.go:117] "RemoveContainer" containerID="493cf0ea40e1ecb4c2bc2c0fc9bcd32cc6e220ecfec73e51aec90faf9abebac3" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.847787 4845 scope.go:117] "RemoveContainer" containerID="51bb841af27a85ca68abff1f32dcaf10c9ab7f03618a7c256ab9040498ec70ed" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.897508 4845 scope.go:117] "RemoveContainer" containerID="c4fb16737964c587428fe338ca95d6abb864bb90373d40cc0d8bd05a89c69fe2" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.926948 4845 scope.go:117] "RemoveContainer" containerID="e50f62cc9707edb269ffe6698207592dfbc48d0ade6ff635de9614b2c3d62a34" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.957984 4845 scope.go:117] "RemoveContainer" containerID="b23d76a377763a9782e43474e58cb72f1e55efdf39ac9b7aaca3beca20c268f7" Feb 02 11:03:08 crc kubenswrapper[4845]: I0202 11:03:08.996238 4845 scope.go:117] "RemoveContainer" containerID="f8e4a4bf00801e300e8c97b01bb80d6e16ede05a4fe2abe38abfdf7564fa62f4" Feb 02 11:03:09 crc kubenswrapper[4845]: I0202 11:03:09.037535 4845 scope.go:117] "RemoveContainer" 
containerID="055042cb2d15b6eebd454cdc9f356a4e981de86dda6b046eef4e17f7f79f827f" Feb 02 11:03:09 crc kubenswrapper[4845]: I0202 11:03:09.046836 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-hft5g"] Feb 02 11:03:09 crc kubenswrapper[4845]: I0202 11:03:09.064194 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-hft5g"] Feb 02 11:03:09 crc kubenswrapper[4845]: I0202 11:03:09.728932 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="250e18d9-cb14-4309-8d0c-fb341511dba6" path="/var/lib/kubelet/pods/250e18d9-cb14-4309-8d0c-fb341511dba6/volumes" Feb 02 11:03:09 crc kubenswrapper[4845]: I0202 11:03:09.729994 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9868fb5b-b18e-42b0-8532-6e6a55da71d2" path="/var/lib/kubelet/pods/9868fb5b-b18e-42b0-8532-6e6a55da71d2/volumes" Feb 02 11:03:34 crc kubenswrapper[4845]: I0202 11:03:34.046410 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-g8b4r"] Feb 02 11:03:34 crc kubenswrapper[4845]: I0202 11:03:34.057306 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-g8b4r"] Feb 02 11:03:35 crc kubenswrapper[4845]: I0202 11:03:35.731300 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183b0ef9-490f-43a1-a464-2bd64a820ebd" path="/var/lib/kubelet/pods/183b0ef9-490f-43a1-a464-2bd64a820ebd/volumes" Feb 02 11:04:09 crc kubenswrapper[4845]: I0202 11:04:09.462459 4845 scope.go:117] "RemoveContainer" containerID="b6c0a29825672ec889a1c4e9480e6e2959d05e730a7944a0c2ac39bff41e3be4" Feb 02 11:04:09 crc kubenswrapper[4845]: I0202 11:04:09.508159 4845 scope.go:117] "RemoveContainer" containerID="bf4552b15f381b58f4ac832ee06ab76b9eb11fbeaace10aae25c2f5b9bfbac69" Feb 02 11:04:09 crc kubenswrapper[4845]: I0202 11:04:09.574718 4845 scope.go:117] "RemoveContainer" containerID="1ae234554f1bf061b9f12986d072427a2f50a26aa3b60b9c9cf12a5ccc0e8cce" Feb 
02 11:04:32 crc kubenswrapper[4845]: I0202 11:04:32.041806 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8fed-account-create-update-p76r4"]
Feb 02 11:04:32 crc kubenswrapper[4845]: I0202 11:04:32.052875 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8fed-account-create-update-p76r4"]
Feb 02 11:04:32 crc kubenswrapper[4845]: I0202 11:04:32.076282 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-p6lbd"]
Feb 02 11:04:32 crc kubenswrapper[4845]: I0202 11:04:32.088138 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-p6lbd"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.026922 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vkhkq"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.038056 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-69t2n"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.050212 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vkhkq"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.062440 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4ab2-account-create-update-l992n"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.086582 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-69t2n"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.100453 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4ab2-account-create-update-l992n"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.111870 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-dfd5-account-create-update-9mdwn"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.121967 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-dfd5-account-create-update-9mdwn"]
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.727055 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a7f3c4-2a4a-4d07-91ee-27a63961c272" path="/var/lib/kubelet/pods/08a7f3c4-2a4a-4d07-91ee-27a63961c272/volumes"
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.728345 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36b758e3-acc2-451a-b64d-9c53a7e5f98f" path="/var/lib/kubelet/pods/36b758e3-acc2-451a-b64d-9c53a7e5f98f/volumes"
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.729223 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621aa5b7-f496-48f4-a72d-74e8886f813e" path="/var/lib/kubelet/pods/621aa5b7-f496-48f4-a72d-74e8886f813e/volumes"
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.730031 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65acb40f-b003-4d37-93c0-4198beba28ed" path="/var/lib/kubelet/pods/65acb40f-b003-4d37-93c0-4198beba28ed/volumes"
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.731621 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93e02369-64e6-46f8-a84d-f50396230784" path="/var/lib/kubelet/pods/93e02369-64e6-46f8-a84d-f50396230784/volumes"
Feb 02 11:04:33 crc kubenswrapper[4845]: I0202 11:04:33.734050 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9ca8c7e-f45d-4014-9599-2ba08495811f" path="/var/lib/kubelet/pods/b9ca8c7e-f45d-4014-9599-2ba08495811f/volumes"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.360612 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g2rm9"]
Feb 02 11:04:40 crc kubenswrapper[4845]: E0202 11:04:40.361789 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7303667b-89bb-4ad1-92a8-3c94525911d4" containerName="keystone-cron"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.361806 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7303667b-89bb-4ad1-92a8-3c94525911d4" containerName="keystone-cron"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.362071 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7303667b-89bb-4ad1-92a8-3c94525911d4" containerName="keystone-cron"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.364497 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.376240 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2rm9"]
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.497061 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-utilities\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.497261 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ttzc\" (UniqueName: \"kubernetes.io/projected/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-kube-api-access-6ttzc\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.497491 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-catalog-content\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.599154 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-utilities\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.599492 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ttzc\" (UniqueName: \"kubernetes.io/projected/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-kube-api-access-6ttzc\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.599657 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-catalog-content\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.599841 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-utilities\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.600174 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-catalog-content\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.623678 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ttzc\" (UniqueName: \"kubernetes.io/projected/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-kube-api-access-6ttzc\") pod \"redhat-marketplace-g2rm9\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") " pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:40 crc kubenswrapper[4845]: I0202 11:04:40.711039 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:41 crc kubenswrapper[4845]: I0202 11:04:41.222103 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2rm9"]
Feb 02 11:04:41 crc kubenswrapper[4845]: I0202 11:04:41.258211 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2rm9" event={"ID":"a5d969d0-588a-489d-a3b7-936e8e2f0c4e","Type":"ContainerStarted","Data":"83963d09faf5438bbb2bea1fea716dfae030b316cca0108f56c9404d7d2ecfc4"}
Feb 02 11:04:42 crc kubenswrapper[4845]: I0202 11:04:42.282836 4845 generic.go:334] "Generic (PLEG): container finished" podID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerID="4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220" exitCode=0
Feb 02 11:04:42 crc kubenswrapper[4845]: I0202 11:04:42.283048 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2rm9" event={"ID":"a5d969d0-588a-489d-a3b7-936e8e2f0c4e","Type":"ContainerDied","Data":"4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220"}
Feb 02 11:04:42 crc kubenswrapper[4845]: I0202 11:04:42.285744 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.294501 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2rm9" event={"ID":"a5d969d0-588a-489d-a3b7-936e8e2f0c4e","Type":"ContainerStarted","Data":"58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c"}
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.550685 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8c6ts"]
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.553117 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.562730 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8c6ts"]
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.569481 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-catalog-content\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.569607 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmthk\" (UniqueName: \"kubernetes.io/projected/ae5f408f-28d9-4652-a265-c49fa34ab604-kube-api-access-mmthk\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.569811 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-utilities\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.673514 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-catalog-content\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.673684 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmthk\" (UniqueName: \"kubernetes.io/projected/ae5f408f-28d9-4652-a265-c49fa34ab604-kube-api-access-mmthk\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.673908 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-utilities\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.674225 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-catalog-content\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.674627 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-utilities\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.706765 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmthk\" (UniqueName: \"kubernetes.io/projected/ae5f408f-28d9-4652-a265-c49fa34ab604-kube-api-access-mmthk\") pod \"certified-operators-8c6ts\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") " pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:43 crc kubenswrapper[4845]: I0202 11:04:43.870803 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:44 crc kubenswrapper[4845]: I0202 11:04:44.307967 4845 generic.go:334] "Generic (PLEG): container finished" podID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerID="58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c" exitCode=0
Feb 02 11:04:44 crc kubenswrapper[4845]: I0202 11:04:44.308028 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2rm9" event={"ID":"a5d969d0-588a-489d-a3b7-936e8e2f0c4e","Type":"ContainerDied","Data":"58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c"}
Feb 02 11:04:44 crc kubenswrapper[4845]: I0202 11:04:44.410448 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8c6ts"]
Feb 02 11:04:45 crc kubenswrapper[4845]: I0202 11:04:45.324204 4845 generic.go:334] "Generic (PLEG): container finished" podID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerID="26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79" exitCode=0
Feb 02 11:04:45 crc kubenswrapper[4845]: I0202 11:04:45.324571 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c6ts" event={"ID":"ae5f408f-28d9-4652-a265-c49fa34ab604","Type":"ContainerDied","Data":"26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79"}
Feb 02 11:04:45 crc kubenswrapper[4845]: I0202 11:04:45.324652 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c6ts" event={"ID":"ae5f408f-28d9-4652-a265-c49fa34ab604","Type":"ContainerStarted","Data":"b1870972ad32591386243137d4d6406259a3aed6d72d8c02868ba64bfdba894c"}
Feb 02 11:04:45 crc kubenswrapper[4845]: I0202 11:04:45.329671 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2rm9" event={"ID":"a5d969d0-588a-489d-a3b7-936e8e2f0c4e","Type":"ContainerStarted","Data":"5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb"}
Feb 02 11:04:45 crc kubenswrapper[4845]: I0202 11:04:45.395877 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g2rm9" podStartSLOduration=2.956421893 podStartE2EDuration="5.395853667s" podCreationTimestamp="2026-02-02 11:04:40 +0000 UTC" firstStartedPulling="2026-02-02 11:04:42.28553992 +0000 UTC m=+1963.376941370" lastFinishedPulling="2026-02-02 11:04:44.724971694 +0000 UTC m=+1965.816373144" observedRunningTime="2026-02-02 11:04:45.371773497 +0000 UTC m=+1966.463174957" watchObservedRunningTime="2026-02-02 11:04:45.395853667 +0000 UTC m=+1966.487255117"
Feb 02 11:04:46 crc kubenswrapper[4845]: I0202 11:04:46.237741 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:04:46 crc kubenswrapper[4845]: I0202 11:04:46.238063 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:04:47 crc kubenswrapper[4845]: I0202 11:04:47.354401 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c6ts" event={"ID":"ae5f408f-28d9-4652-a265-c49fa34ab604","Type":"ContainerStarted","Data":"80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03"}
Feb 02 11:04:48 crc kubenswrapper[4845]: I0202 11:04:48.371720 4845 generic.go:334] "Generic (PLEG): container finished" podID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerID="80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03" exitCode=0
Feb 02 11:04:48 crc kubenswrapper[4845]: I0202 11:04:48.371781 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c6ts" event={"ID":"ae5f408f-28d9-4652-a265-c49fa34ab604","Type":"ContainerDied","Data":"80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03"}
Feb 02 11:04:49 crc kubenswrapper[4845]: I0202 11:04:49.387725 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c6ts" event={"ID":"ae5f408f-28d9-4652-a265-c49fa34ab604","Type":"ContainerStarted","Data":"3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae"}
Feb 02 11:04:49 crc kubenswrapper[4845]: I0202 11:04:49.415494 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8c6ts" podStartSLOduration=2.68541841 podStartE2EDuration="6.415473244s" podCreationTimestamp="2026-02-02 11:04:43 +0000 UTC" firstStartedPulling="2026-02-02 11:04:45.328179589 +0000 UTC m=+1966.419581039" lastFinishedPulling="2026-02-02 11:04:49.058234433 +0000 UTC m=+1970.149635873" observedRunningTime="2026-02-02 11:04:49.41041358 +0000 UTC m=+1970.501815040" watchObservedRunningTime="2026-02-02 11:04:49.415473244 +0000 UTC m=+1970.506874694"
Feb 02 11:04:50 crc kubenswrapper[4845]: I0202 11:04:50.711670 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:50 crc kubenswrapper[4845]: I0202 11:04:50.711845 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:50 crc kubenswrapper[4845]: I0202 11:04:50.772588 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:51 crc kubenswrapper[4845]: I0202 11:04:51.464357 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:52 crc kubenswrapper[4845]: I0202 11:04:52.343304 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2rm9"]
Feb 02 11:04:53 crc kubenswrapper[4845]: I0202 11:04:53.427791 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g2rm9" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerName="registry-server" containerID="cri-o://5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb" gracePeriod=2
Feb 02 11:04:53 crc kubenswrapper[4845]: I0202 11:04:53.871131 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:53 crc kubenswrapper[4845]: I0202 11:04:53.871512 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:53 crc kubenswrapper[4845]: I0202 11:04:53.932899 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.003419 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.064471 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0d03-account-create-update-79bpm"]
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.080525 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-mjh66"]
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.092403 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-mjh66"]
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.102962 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0d03-account-create-update-79bpm"]
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.178381 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-catalog-content\") pod \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") "
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.178566 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ttzc\" (UniqueName: \"kubernetes.io/projected/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-kube-api-access-6ttzc\") pod \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") "
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.178740 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-utilities\") pod \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\" (UID: \"a5d969d0-588a-489d-a3b7-936e8e2f0c4e\") "
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.180242 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-utilities" (OuterVolumeSpecName: "utilities") pod "a5d969d0-588a-489d-a3b7-936e8e2f0c4e" (UID: "a5d969d0-588a-489d-a3b7-936e8e2f0c4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.186543 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-kube-api-access-6ttzc" (OuterVolumeSpecName: "kube-api-access-6ttzc") pod "a5d969d0-588a-489d-a3b7-936e8e2f0c4e" (UID: "a5d969d0-588a-489d-a3b7-936e8e2f0c4e"). InnerVolumeSpecName "kube-api-access-6ttzc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.206494 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5d969d0-588a-489d-a3b7-936e8e2f0c4e" (UID: "a5d969d0-588a-489d-a3b7-936e8e2f0c4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.284011 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.284067 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.284083 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ttzc\" (UniqueName: \"kubernetes.io/projected/a5d969d0-588a-489d-a3b7-936e8e2f0c4e-kube-api-access-6ttzc\") on node \"crc\" DevicePath \"\""
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.445852 4845 generic.go:334] "Generic (PLEG): container finished" podID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerID="5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb" exitCode=0
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.445973 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2rm9" event={"ID":"a5d969d0-588a-489d-a3b7-936e8e2f0c4e","Type":"ContainerDied","Data":"5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb"}
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.446103 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g2rm9" event={"ID":"a5d969d0-588a-489d-a3b7-936e8e2f0c4e","Type":"ContainerDied","Data":"83963d09faf5438bbb2bea1fea716dfae030b316cca0108f56c9404d7d2ecfc4"}
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.446145 4845 scope.go:117] "RemoveContainer" containerID="5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.446143 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g2rm9"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.494345 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2rm9"]
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.497247 4845 scope.go:117] "RemoveContainer" containerID="58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.513289 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g2rm9"]
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.521227 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.531931 4845 scope.go:117] "RemoveContainer" containerID="4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.607153 4845 scope.go:117] "RemoveContainer" containerID="5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb"
Feb 02 11:04:54 crc kubenswrapper[4845]: E0202 11:04:54.607787 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb\": container with ID starting with 5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb not found: ID does not exist" containerID="5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.607835 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb"} err="failed to get container status \"5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb\": rpc error: code = NotFound desc = could not find container \"5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb\": container with ID starting with 5614282f6fcb67b0799762bc9a609e2fe90e0c2c698f6852a2ae79d0cc58c6eb not found: ID does not exist"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.607867 4845 scope.go:117] "RemoveContainer" containerID="58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c"
Feb 02 11:04:54 crc kubenswrapper[4845]: E0202 11:04:54.608542 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c\": container with ID starting with 58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c not found: ID does not exist" containerID="58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.608606 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c"} err="failed to get container status \"58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c\": rpc error: code = NotFound desc = could not find container \"58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c\": container with ID starting with 58e792a966d78256f900ef7ef7a4255a79fe30a8f09058b484b2d1639fe6d89c not found: ID does not exist"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.608643 4845 scope.go:117] "RemoveContainer" containerID="4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220"
Feb 02 11:04:54 crc kubenswrapper[4845]: E0202 11:04:54.609009 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220\": container with ID starting with 4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220 not found: ID does not exist" containerID="4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220"
Feb 02 11:04:54 crc kubenswrapper[4845]: I0202 11:04:54.609144 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220"} err="failed to get container status \"4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220\": rpc error: code = NotFound desc = could not find container \"4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220\": container with ID starting with 4b94519ad73f936d86efd08640937b6c1ba7a905b0250910dd3b103f84645220 not found: ID does not exist"
Feb 02 11:04:55 crc kubenswrapper[4845]: I0202 11:04:55.747025 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c902530-dc88-4300-9356-1f3938cfef4a" path="/var/lib/kubelet/pods/7c902530-dc88-4300-9356-1f3938cfef4a/volumes"
Feb 02 11:04:55 crc kubenswrapper[4845]: I0202 11:04:55.749797 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" path="/var/lib/kubelet/pods/a5d969d0-588a-489d-a3b7-936e8e2f0c4e/volumes"
Feb 02 11:04:55 crc kubenswrapper[4845]: I0202 11:04:55.753119 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f45c3661-e66b-41a2-9a98-db215df0b2cf" path="/var/lib/kubelet/pods/f45c3661-e66b-41a2-9a98-db215df0b2cf/volumes"
Feb 02 11:04:56 crc kubenswrapper[4845]: I0202 11:04:56.344421 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8c6ts"]
Feb 02 11:04:56 crc kubenswrapper[4845]: I0202 11:04:56.482641 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8c6ts" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerName="registry-server" containerID="cri-o://3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae" gracePeriod=2
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.015941 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.064819 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-catalog-content\") pod \"ae5f408f-28d9-4652-a265-c49fa34ab604\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") "
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.065119 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-utilities\") pod \"ae5f408f-28d9-4652-a265-c49fa34ab604\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") "
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.065202 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmthk\" (UniqueName: \"kubernetes.io/projected/ae5f408f-28d9-4652-a265-c49fa34ab604-kube-api-access-mmthk\") pod \"ae5f408f-28d9-4652-a265-c49fa34ab604\" (UID: \"ae5f408f-28d9-4652-a265-c49fa34ab604\") "
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.066875 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-utilities" (OuterVolumeSpecName: "utilities") pod "ae5f408f-28d9-4652-a265-c49fa34ab604" (UID: "ae5f408f-28d9-4652-a265-c49fa34ab604"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.075376 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5f408f-28d9-4652-a265-c49fa34ab604-kube-api-access-mmthk" (OuterVolumeSpecName: "kube-api-access-mmthk") pod "ae5f408f-28d9-4652-a265-c49fa34ab604" (UID: "ae5f408f-28d9-4652-a265-c49fa34ab604"). InnerVolumeSpecName "kube-api-access-mmthk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.131226 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae5f408f-28d9-4652-a265-c49fa34ab604" (UID: "ae5f408f-28d9-4652-a265-c49fa34ab604"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.169847 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.169920 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae5f408f-28d9-4652-a265-c49fa34ab604-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.169932 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmthk\" (UniqueName: \"kubernetes.io/projected/ae5f408f-28d9-4652-a265-c49fa34ab604-kube-api-access-mmthk\") on node \"crc\" DevicePath \"\""
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.495674 4845 generic.go:334] "Generic (PLEG): container finished" podID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerID="3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae" exitCode=0
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.496045 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c6ts" event={"ID":"ae5f408f-28d9-4652-a265-c49fa34ab604","Type":"ContainerDied","Data":"3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae"}
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.496097 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8c6ts" event={"ID":"ae5f408f-28d9-4652-a265-c49fa34ab604","Type":"ContainerDied","Data":"b1870972ad32591386243137d4d6406259a3aed6d72d8c02868ba64bfdba894c"}
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.496105 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8c6ts"
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.496120 4845 scope.go:117] "RemoveContainer" containerID="3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae"
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.522228 4845 scope.go:117] "RemoveContainer" containerID="80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03"
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.549829 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8c6ts"]
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.562237 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8c6ts"]
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.568110 4845 scope.go:117] "RemoveContainer" containerID="26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79"
Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.619251 4845 scope.go:117] "RemoveContainer" containerID="3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae"
Feb 02
11:04:57 crc kubenswrapper[4845]: E0202 11:04:57.619973 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae\": container with ID starting with 3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae not found: ID does not exist" containerID="3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae" Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.620010 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae"} err="failed to get container status \"3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae\": rpc error: code = NotFound desc = could not find container \"3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae\": container with ID starting with 3ac8acdbdb10a2f21e5876abcb2d3667bfaec600f7a206f9b9e03ca620f48dae not found: ID does not exist" Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.620037 4845 scope.go:117] "RemoveContainer" containerID="80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03" Feb 02 11:04:57 crc kubenswrapper[4845]: E0202 11:04:57.620562 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03\": container with ID starting with 80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03 not found: ID does not exist" containerID="80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03" Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.620640 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03"} err="failed to get container status 
\"80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03\": rpc error: code = NotFound desc = could not find container \"80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03\": container with ID starting with 80e7346a150d6048bdbfda0a75a9f91d29b4a7dfc459101707ba10e54a0fac03 not found: ID does not exist" Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.620694 4845 scope.go:117] "RemoveContainer" containerID="26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79" Feb 02 11:04:57 crc kubenswrapper[4845]: E0202 11:04:57.621126 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79\": container with ID starting with 26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79 not found: ID does not exist" containerID="26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79" Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.621157 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79"} err="failed to get container status \"26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79\": rpc error: code = NotFound desc = could not find container \"26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79\": container with ID starting with 26d9f500c910a918a9b073b63ed2836fd48901c0e0db17c8b0223fb520c3df79 not found: ID does not exist" Feb 02 11:04:57 crc kubenswrapper[4845]: I0202 11:04:57.727819 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" path="/var/lib/kubelet/pods/ae5f408f-28d9-4652-a265-c49fa34ab604/volumes" Feb 02 11:05:09 crc kubenswrapper[4845]: I0202 11:05:09.705254 4845 scope.go:117] "RemoveContainer" containerID="7bd7e7f5964f9f073fd42624c4ca749f932b5e04b1db26f0a1d5d0f87ecbdb1f" Feb 02 
11:05:09 crc kubenswrapper[4845]: I0202 11:05:09.729842 4845 scope.go:117] "RemoveContainer" containerID="b7177510879626b8e93e3bec07d3378fb19c2869cf116cbd055fdaad370ef7da" Feb 02 11:05:09 crc kubenswrapper[4845]: I0202 11:05:09.798625 4845 scope.go:117] "RemoveContainer" containerID="74d2907667c2c914fc57daa5fed9146112b93f8bf9f401e92958c5811e8dc6f3" Feb 02 11:05:09 crc kubenswrapper[4845]: I0202 11:05:09.861515 4845 scope.go:117] "RemoveContainer" containerID="6019d47869688d15147a8932c0b84dab21fa14445dd8f012a8b145c6bed8a74d" Feb 02 11:05:09 crc kubenswrapper[4845]: I0202 11:05:09.922846 4845 scope.go:117] "RemoveContainer" containerID="8ac663309bf94373ab6aee2bd864d56af41199dceccdc883fa0aff4b2aa502f8" Feb 02 11:05:09 crc kubenswrapper[4845]: I0202 11:05:09.985811 4845 scope.go:117] "RemoveContainer" containerID="45867aad16ddef2c9f5a1df92c6f8cab6d7f01caa36027be8a34aecad2ed798b" Feb 02 11:05:10 crc kubenswrapper[4845]: I0202 11:05:10.037311 4845 scope.go:117] "RemoveContainer" containerID="d9f965fd9f9dbd88dc973b197a2f3c57c9e0f963db241ff90038d5e96106751f" Feb 02 11:05:10 crc kubenswrapper[4845]: I0202 11:05:10.061350 4845 scope.go:117] "RemoveContainer" containerID="8360a8bbf698c5c922fa9756941c2a4df903063c7069ac543b4e17f4e1f546d5" Feb 02 11:05:16 crc kubenswrapper[4845]: I0202 11:05:16.061086 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qwpzq"] Feb 02 11:05:16 crc kubenswrapper[4845]: I0202 11:05:16.073384 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qwpzq"] Feb 02 11:05:16 crc kubenswrapper[4845]: I0202 11:05:16.238282 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:05:16 crc kubenswrapper[4845]: I0202 
11:05:16.238381 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:05:17 crc kubenswrapper[4845]: I0202 11:05:17.726550 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff" path="/var/lib/kubelet/pods/cbc0d18f-cd37-49cf-8aa1-b62f1cf533ff/volumes" Feb 02 11:05:40 crc kubenswrapper[4845]: I0202 11:05:40.065798 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-725jn"] Feb 02 11:05:40 crc kubenswrapper[4845]: I0202 11:05:40.083766 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-725jn"] Feb 02 11:05:41 crc kubenswrapper[4845]: I0202 11:05:41.727381 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7439e987-75e8-4cc8-840a-742c6f07dea9" path="/var/lib/kubelet/pods/7439e987-75e8-4cc8-840a-742c6f07dea9/volumes" Feb 02 11:05:42 crc kubenswrapper[4845]: I0202 11:05:42.038072 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r4lj7"] Feb 02 11:05:42 crc kubenswrapper[4845]: I0202 11:05:42.054864 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-r4lj7"] Feb 02 11:05:43 crc kubenswrapper[4845]: I0202 11:05:43.727133 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b2bad3a-8153-41d8-83f6-9f9caa16589b" path="/var/lib/kubelet/pods/7b2bad3a-8153-41d8-83f6-9f9caa16589b/volumes" Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.237502 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.237827 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.237876 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.238901 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b390714f50dcbc5e75002b4a0473fa9805c797be6ccee1c185ac005a97bc29e"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.238981 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://0b390714f50dcbc5e75002b4a0473fa9805c797be6ccee1c185ac005a97bc29e" gracePeriod=600 Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.993803 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="0b390714f50dcbc5e75002b4a0473fa9805c797be6ccee1c185ac005a97bc29e" exitCode=0 Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.993910 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"0b390714f50dcbc5e75002b4a0473fa9805c797be6ccee1c185ac005a97bc29e"} Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.994456 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"} Feb 02 11:05:46 crc kubenswrapper[4845]: I0202 11:05:46.994500 4845 scope.go:117] "RemoveContainer" containerID="d11ef40d44057c492efbc2859a87d8dd29e6dc58cef9dafd0bcece48331b0020" Feb 02 11:06:10 crc kubenswrapper[4845]: I0202 11:06:10.285352 4845 scope.go:117] "RemoveContainer" containerID="4bd5dbe3f7a8b7903c6ee652f72c588239fb784e24ebb2f47f5ca3b9452668fa" Feb 02 11:06:10 crc kubenswrapper[4845]: I0202 11:06:10.351927 4845 scope.go:117] "RemoveContainer" containerID="7ebfe4b502f94b649860108409e8e9b93586764606d6f176206def30cf86a61d" Feb 02 11:06:10 crc kubenswrapper[4845]: I0202 11:06:10.426404 4845 scope.go:117] "RemoveContainer" containerID="f926f2a33b190bee678900de2f6fcfec869f9b6aacd707a0eedb2404459e01e7" Feb 02 11:06:26 crc kubenswrapper[4845]: I0202 11:06:26.048008 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5zwdv"] Feb 02 11:06:26 crc kubenswrapper[4845]: I0202 11:06:26.059932 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5zwdv"] Feb 02 11:06:27 crc kubenswrapper[4845]: I0202 11:06:27.726139 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b" path="/var/lib/kubelet/pods/2b73ddcb-6457-4dd4-9d61-7ede43d6ea4b/volumes" Feb 02 11:07:10 crc kubenswrapper[4845]: I0202 11:07:10.582357 4845 scope.go:117] "RemoveContainer" containerID="8584a98006ccb553e74a988ef9d575c49271ebfc304ac836fbd4745ff8e13b8d" Feb 02 11:07:46 crc 
kubenswrapper[4845]: I0202 11:07:46.237774 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:07:46 crc kubenswrapper[4845]: I0202 11:07:46.238355 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.892110 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qqwd7"] Feb 02 11:08:09 crc kubenswrapper[4845]: E0202 11:08:09.893056 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerName="registry-server" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.893069 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerName="registry-server" Feb 02 11:08:09 crc kubenswrapper[4845]: E0202 11:08:09.893082 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerName="extract-utilities" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.893088 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerName="extract-utilities" Feb 02 11:08:09 crc kubenswrapper[4845]: E0202 11:08:09.893102 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerName="extract-content" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.893110 4845 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerName="extract-content" Feb 02 11:08:09 crc kubenswrapper[4845]: E0202 11:08:09.893123 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerName="extract-utilities" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.893130 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerName="extract-utilities" Feb 02 11:08:09 crc kubenswrapper[4845]: E0202 11:08:09.893151 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerName="registry-server" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.893157 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerName="registry-server" Feb 02 11:08:09 crc kubenswrapper[4845]: E0202 11:08:09.893170 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerName="extract-content" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.893175 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerName="extract-content" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.893420 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5f408f-28d9-4652-a265-c49fa34ab604" containerName="registry-server" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.893442 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d969d0-588a-489d-a3b7-936e8e2f0c4e" containerName="registry-server" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.895054 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:09 crc kubenswrapper[4845]: I0202 11:08:09.912957 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqwd7"] Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.041273 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-catalog-content\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.041508 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcbwq\" (UniqueName: \"kubernetes.io/projected/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-kube-api-access-qcbwq\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.041977 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-utilities\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.145036 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-catalog-content\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.145132 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qcbwq\" (UniqueName: \"kubernetes.io/projected/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-kube-api-access-qcbwq\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.145310 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-utilities\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.145568 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-catalog-content\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.145955 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-utilities\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.168053 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcbwq\" (UniqueName: \"kubernetes.io/projected/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-kube-api-access-qcbwq\") pod \"community-operators-qqwd7\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") " pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.218569 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qqwd7" Feb 02 11:08:10 crc kubenswrapper[4845]: I0202 11:08:10.873163 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qqwd7"] Feb 02 11:08:11 crc kubenswrapper[4845]: I0202 11:08:11.588422 4845 generic.go:334] "Generic (PLEG): container finished" podID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerID="0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0" exitCode=0 Feb 02 11:08:11 crc kubenswrapper[4845]: I0202 11:08:11.588506 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwd7" event={"ID":"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4","Type":"ContainerDied","Data":"0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0"} Feb 02 11:08:11 crc kubenswrapper[4845]: I0202 11:08:11.588767 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwd7" event={"ID":"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4","Type":"ContainerStarted","Data":"4e21bfde1a1ee3b49ab06588e29302cbcb090c8fd115755fa8b2ad925a6ae8aa"} Feb 02 11:08:12 crc kubenswrapper[4845]: I0202 11:08:12.600487 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwd7" event={"ID":"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4","Type":"ContainerStarted","Data":"24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814"} Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.078526 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vzkq6"] Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.081171 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.100719 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vzkq6"] Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.235231 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-utilities\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.235305 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-catalog-content\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.235473 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrwvj\" (UniqueName: \"kubernetes.io/projected/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-kube-api-access-jrwvj\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.337835 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrwvj\" (UniqueName: \"kubernetes.io/projected/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-kube-api-access-jrwvj\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.338172 4845 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-utilities\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.338218 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-catalog-content\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.338680 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-utilities\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.338788 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-catalog-content\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.361392 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrwvj\" (UniqueName: \"kubernetes.io/projected/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-kube-api-access-jrwvj\") pod \"redhat-operators-vzkq6\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") " pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.399687 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vzkq6" Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.629203 4845 generic.go:334] "Generic (PLEG): container finished" podID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerID="24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814" exitCode=0 Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.629355 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwd7" event={"ID":"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4","Type":"ContainerDied","Data":"24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814"} Feb 02 11:08:13 crc kubenswrapper[4845]: I0202 11:08:13.960350 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vzkq6"] Feb 02 11:08:14 crc kubenswrapper[4845]: I0202 11:08:14.641575 4845 generic.go:334] "Generic (PLEG): container finished" podID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerID="4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486" exitCode=0 Feb 02 11:08:14 crc kubenswrapper[4845]: I0202 11:08:14.641675 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzkq6" event={"ID":"df0bb4b2-0075-4535-83d7-ff2a511bcfc4","Type":"ContainerDied","Data":"4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486"} Feb 02 11:08:14 crc kubenswrapper[4845]: I0202 11:08:14.642217 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzkq6" event={"ID":"df0bb4b2-0075-4535-83d7-ff2a511bcfc4","Type":"ContainerStarted","Data":"498eff50f0467701d7486f7e5e6cf9b36286f8e1d73a699ceb16df7b8ad64222"} Feb 02 11:08:14 crc kubenswrapper[4845]: I0202 11:08:14.648199 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwd7" 
event={"ID":"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4","Type":"ContainerStarted","Data":"b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4"}
Feb 02 11:08:14 crc kubenswrapper[4845]: I0202 11:08:14.700711 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qqwd7" podStartSLOduration=3.180613261 podStartE2EDuration="5.700665572s" podCreationTimestamp="2026-02-02 11:08:09 +0000 UTC" firstStartedPulling="2026-02-02 11:08:11.591379005 +0000 UTC m=+2172.682780455" lastFinishedPulling="2026-02-02 11:08:14.111431326 +0000 UTC m=+2175.202832766" observedRunningTime="2026-02-02 11:08:14.682044148 +0000 UTC m=+2175.773445598" watchObservedRunningTime="2026-02-02 11:08:14.700665572 +0000 UTC m=+2175.792067022"
Feb 02 11:08:15 crc kubenswrapper[4845]: I0202 11:08:15.678528 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzkq6" event={"ID":"df0bb4b2-0075-4535-83d7-ff2a511bcfc4","Type":"ContainerStarted","Data":"1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6"}
Feb 02 11:08:16 crc kubenswrapper[4845]: I0202 11:08:16.237632 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:08:16 crc kubenswrapper[4845]: I0202 11:08:16.237695 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:08:18 crc kubenswrapper[4845]: I0202 11:08:18.711604 4845 generic.go:334] "Generic (PLEG): container finished" podID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerID="1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6" exitCode=0
Feb 02 11:08:18 crc kubenswrapper[4845]: I0202 11:08:18.711683 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzkq6" event={"ID":"df0bb4b2-0075-4535-83d7-ff2a511bcfc4","Type":"ContainerDied","Data":"1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6"}
Feb 02 11:08:19 crc kubenswrapper[4845]: I0202 11:08:19.728990 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzkq6" event={"ID":"df0bb4b2-0075-4535-83d7-ff2a511bcfc4","Type":"ContainerStarted","Data":"636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6"}
Feb 02 11:08:19 crc kubenswrapper[4845]: I0202 11:08:19.767148 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vzkq6" podStartSLOduration=2.217484662 podStartE2EDuration="6.767131519s" podCreationTimestamp="2026-02-02 11:08:13 +0000 UTC" firstStartedPulling="2026-02-02 11:08:14.644442101 +0000 UTC m=+2175.735843551" lastFinishedPulling="2026-02-02 11:08:19.194088958 +0000 UTC m=+2180.285490408" observedRunningTime="2026-02-02 11:08:19.748601919 +0000 UTC m=+2180.840003369" watchObservedRunningTime="2026-02-02 11:08:19.767131519 +0000 UTC m=+2180.858532969"
Feb 02 11:08:20 crc kubenswrapper[4845]: I0202 11:08:20.220220 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qqwd7"
Feb 02 11:08:20 crc kubenswrapper[4845]: I0202 11:08:20.220627 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qqwd7"
Feb 02 11:08:20 crc kubenswrapper[4845]: I0202 11:08:20.283959 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qqwd7"
Feb 02 11:08:20 crc kubenswrapper[4845]: I0202 11:08:20.802664 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qqwd7"
Feb 02 11:08:21 crc kubenswrapper[4845]: I0202 11:08:21.470641 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qqwd7"]
Feb 02 11:08:22 crc kubenswrapper[4845]: I0202 11:08:22.765339 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qqwd7" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerName="registry-server" containerID="cri-o://b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4" gracePeriod=2
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.298657 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qqwd7"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.400910 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vzkq6"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.400977 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vzkq6"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.411644 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-catalog-content\") pod \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") "
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.411787 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-utilities\") pod \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") "
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.411874 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcbwq\" (UniqueName: \"kubernetes.io/projected/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-kube-api-access-qcbwq\") pod \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\" (UID: \"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4\") "
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.414646 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-utilities" (OuterVolumeSpecName: "utilities") pod "ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" (UID: "ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.424480 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-kube-api-access-qcbwq" (OuterVolumeSpecName: "kube-api-access-qcbwq") pod "ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" (UID: "ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4"). InnerVolumeSpecName "kube-api-access-qcbwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.467331 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" (UID: "ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.514729 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.514771 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcbwq\" (UniqueName: \"kubernetes.io/projected/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-kube-api-access-qcbwq\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.514781 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.790514 4845 generic.go:334] "Generic (PLEG): container finished" podID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerID="b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4" exitCode=0
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.790560 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwd7" event={"ID":"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4","Type":"ContainerDied","Data":"b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4"}
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.790587 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qqwd7" event={"ID":"ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4","Type":"ContainerDied","Data":"4e21bfde1a1ee3b49ab06588e29302cbcb090c8fd115755fa8b2ad925a6ae8aa"}
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.790606 4845 scope.go:117] "RemoveContainer" containerID="b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.790730 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qqwd7"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.818701 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qqwd7"]
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.823983 4845 scope.go:117] "RemoveContainer" containerID="24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.828928 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qqwd7"]
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.854419 4845 scope.go:117] "RemoveContainer" containerID="0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.934017 4845 scope.go:117] "RemoveContainer" containerID="b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4"
Feb 02 11:08:23 crc kubenswrapper[4845]: E0202 11:08:23.934801 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4\": container with ID starting with b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4 not found: ID does not exist" containerID="b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.934858 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4"} err="failed to get container status \"b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4\": rpc error: code = NotFound desc = could not find container \"b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4\": container with ID starting with b4be15d794a32e41d2e3bb12536560ef1eef2d87d720ceb166d9996a9713dfb4 not found: ID does not exist"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.934994 4845 scope.go:117] "RemoveContainer" containerID="24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814"
Feb 02 11:08:23 crc kubenswrapper[4845]: E0202 11:08:23.935495 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814\": container with ID starting with 24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814 not found: ID does not exist" containerID="24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.935530 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814"} err="failed to get container status \"24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814\": rpc error: code = NotFound desc = could not find container \"24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814\": container with ID starting with 24974c786c52f7413944db868bbd3565db04e2c49ae3260312b96556740b6814 not found: ID does not exist"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.935554 4845 scope.go:117] "RemoveContainer" containerID="0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0"
Feb 02 11:08:23 crc kubenswrapper[4845]: E0202 11:08:23.935916 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0\": container with ID starting with 0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0 not found: ID does not exist" containerID="0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0"
Feb 02 11:08:23 crc kubenswrapper[4845]: I0202 11:08:23.935937 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0"} err="failed to get container status \"0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0\": rpc error: code = NotFound desc = could not find container \"0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0\": container with ID starting with 0a23edc901a67b4f93b00fd9d012f9a5dd5b1c900604091e293e58e20c4e3ee0 not found: ID does not exist"
Feb 02 11:08:24 crc kubenswrapper[4845]: I0202 11:08:24.452792 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vzkq6" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerName="registry-server" probeResult="failure" output=<
Feb 02 11:08:24 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s
Feb 02 11:08:24 crc kubenswrapper[4845]: >
Feb 02 11:08:25 crc kubenswrapper[4845]: I0202 11:08:25.727681 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" path="/var/lib/kubelet/pods/ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4/volumes"
Feb 02 11:08:33 crc kubenswrapper[4845]: I0202 11:08:33.458749 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vzkq6"
Feb 02 11:08:33 crc kubenswrapper[4845]: I0202 11:08:33.526428 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vzkq6"
Feb 02 11:08:33 crc kubenswrapper[4845]: I0202 11:08:33.702614 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vzkq6"]
Feb 02 11:08:34 crc kubenswrapper[4845]: I0202 11:08:34.939300 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vzkq6" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerName="registry-server" containerID="cri-o://636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6" gracePeriod=2
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.469273 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vzkq6"
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.617364 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrwvj\" (UniqueName: \"kubernetes.io/projected/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-kube-api-access-jrwvj\") pod \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") "
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.617833 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-catalog-content\") pod \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") "
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.617877 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-utilities\") pod \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\" (UID: \"df0bb4b2-0075-4535-83d7-ff2a511bcfc4\") "
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.618814 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-utilities" (OuterVolumeSpecName: "utilities") pod "df0bb4b2-0075-4535-83d7-ff2a511bcfc4" (UID: "df0bb4b2-0075-4535-83d7-ff2a511bcfc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.619157 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.625218 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-kube-api-access-jrwvj" (OuterVolumeSpecName: "kube-api-access-jrwvj") pod "df0bb4b2-0075-4535-83d7-ff2a511bcfc4" (UID: "df0bb4b2-0075-4535-83d7-ff2a511bcfc4"). InnerVolumeSpecName "kube-api-access-jrwvj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.722595 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrwvj\" (UniqueName: \"kubernetes.io/projected/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-kube-api-access-jrwvj\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.764199 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df0bb4b2-0075-4535-83d7-ff2a511bcfc4" (UID: "df0bb4b2-0075-4535-83d7-ff2a511bcfc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.824951 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0bb4b2-0075-4535-83d7-ff2a511bcfc4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.952285 4845 generic.go:334] "Generic (PLEG): container finished" podID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerID="636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6" exitCode=0
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.952363 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vzkq6"
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.952378 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzkq6" event={"ID":"df0bb4b2-0075-4535-83d7-ff2a511bcfc4","Type":"ContainerDied","Data":"636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6"}
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.953651 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vzkq6" event={"ID":"df0bb4b2-0075-4535-83d7-ff2a511bcfc4","Type":"ContainerDied","Data":"498eff50f0467701d7486f7e5e6cf9b36286f8e1d73a699ceb16df7b8ad64222"}
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.953675 4845 scope.go:117] "RemoveContainer" containerID="636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6"
Feb 02 11:08:35 crc kubenswrapper[4845]: I0202 11:08:35.974620 4845 scope.go:117] "RemoveContainer" containerID="1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6"
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.003517 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vzkq6"]
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.016404 4845 scope.go:117] "RemoveContainer" containerID="4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486"
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.021610 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vzkq6"]
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.066475 4845 scope.go:117] "RemoveContainer" containerID="636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6"
Feb 02 11:08:36 crc kubenswrapper[4845]: E0202 11:08:36.067100 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6\": container with ID starting with 636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6 not found: ID does not exist" containerID="636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6"
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.067232 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6"} err="failed to get container status \"636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6\": rpc error: code = NotFound desc = could not find container \"636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6\": container with ID starting with 636ab465a8078450ddb10fd3aae1702e75d04808550e3cd60e630da84b8ef3c6 not found: ID does not exist"
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.067318 4845 scope.go:117] "RemoveContainer" containerID="1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6"
Feb 02 11:08:36 crc kubenswrapper[4845]: E0202 11:08:36.067655 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6\": container with ID starting with 1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6 not found: ID does not exist" containerID="1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6"
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.067751 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6"} err="failed to get container status \"1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6\": rpc error: code = NotFound desc = could not find container \"1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6\": container with ID starting with 1cceb1a8e5ffabf74852f80059be6adcba7a287af9c9ef9744104fb6b4a8faf6 not found: ID does not exist"
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.067843 4845 scope.go:117] "RemoveContainer" containerID="4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486"
Feb 02 11:08:36 crc kubenswrapper[4845]: E0202 11:08:36.068331 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486\": container with ID starting with 4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486 not found: ID does not exist" containerID="4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486"
Feb 02 11:08:36 crc kubenswrapper[4845]: I0202 11:08:36.068429 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486"} err="failed to get container status \"4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486\": rpc error: code = NotFound desc = could not find container \"4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486\": container with ID starting with 4687b265863fa3fb6587f2d9ada8ed78a0b6527a64946cba7a527aa7c0a55486 not found: ID does not exist"
Feb 02 11:08:37 crc kubenswrapper[4845]: I0202 11:08:37.728859 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" path="/var/lib/kubelet/pods/df0bb4b2-0075-4535-83d7-ff2a511bcfc4/volumes"
Feb 02 11:08:46 crc kubenswrapper[4845]: I0202 11:08:46.237151 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:08:46 crc kubenswrapper[4845]: I0202 11:08:46.237672 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:08:46 crc kubenswrapper[4845]: I0202 11:08:46.237715 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9"
Feb 02 11:08:46 crc kubenswrapper[4845]: I0202 11:08:46.238674 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:08:46 crc kubenswrapper[4845]: I0202 11:08:46.238728 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" gracePeriod=600
Feb 02 11:08:46 crc kubenswrapper[4845]: E0202 11:08:46.367074 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:08:47 crc kubenswrapper[4845]: I0202 11:08:47.075675 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" exitCode=0
Feb 02 11:08:47 crc kubenswrapper[4845]: I0202 11:08:47.075730 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"}
Feb 02 11:08:47 crc kubenswrapper[4845]: I0202 11:08:47.075770 4845 scope.go:117] "RemoveContainer" containerID="0b390714f50dcbc5e75002b4a0473fa9805c797be6ccee1c185ac005a97bc29e"
Feb 02 11:08:47 crc kubenswrapper[4845]: I0202 11:08:47.076967 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:08:47 crc kubenswrapper[4845]: E0202 11:08:47.077464 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:08:58 crc kubenswrapper[4845]: I0202 11:08:58.713266 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:08:58 crc kubenswrapper[4845]: E0202 11:08:58.714130 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:09:12 crc kubenswrapper[4845]: I0202 11:09:12.713079 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:09:12 crc kubenswrapper[4845]: E0202 11:09:12.714207 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:09:25 crc kubenswrapper[4845]: I0202 11:09:25.712665 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:09:25 crc kubenswrapper[4845]: E0202 11:09:25.713425 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:09:40 crc kubenswrapper[4845]: I0202 11:09:40.713123 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:09:40 crc kubenswrapper[4845]: E0202 11:09:40.713924 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:09:52 crc kubenswrapper[4845]: I0202 11:09:52.713635 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:09:52 crc kubenswrapper[4845]: E0202 11:09:52.714429 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:10:03 crc kubenswrapper[4845]: I0202 11:10:03.713794 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:10:03 crc kubenswrapper[4845]: E0202 11:10:03.715376 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:10:17 crc kubenswrapper[4845]: I0202 11:10:17.713367 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:10:17 crc kubenswrapper[4845]: E0202 11:10:17.714712 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:10:29 crc kubenswrapper[4845]: I0202 11:10:29.718763 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:10:29 crc kubenswrapper[4845]: E0202 11:10:29.719481 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:10:43 crc kubenswrapper[4845]: I0202 11:10:43.713500 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:10:43 crc kubenswrapper[4845]: E0202 11:10:43.714324 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:10:54 crc kubenswrapper[4845]: I0202 11:10:54.713524 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:10:54 crc kubenswrapper[4845]: E0202 11:10:54.714383 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:11:09 crc kubenswrapper[4845]: I0202 11:11:09.721124 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:11:09 crc kubenswrapper[4845]: E0202 11:11:09.722168 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:11:20 crc kubenswrapper[4845]: I0202 11:11:20.714776 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:11:20 crc kubenswrapper[4845]: E0202 11:11:20.716744 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:11:32 crc kubenswrapper[4845]: I0202 11:11:32.712834 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:11:32 crc kubenswrapper[4845]: E0202 11:11:32.713730 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:11:46 crc kubenswrapper[4845]: I0202 11:11:46.712583 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:11:46 crc kubenswrapper[4845]: E0202 11:11:46.713364 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:11:59 crc kubenswrapper[4845]: I0202 11:11:59.723368 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc"
Feb 02 11:11:59 crc kubenswrapper[4845]: E0202 11:11:59.734104 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff:
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:12:11 crc kubenswrapper[4845]: I0202 11:12:11.712866 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:12:11 crc kubenswrapper[4845]: E0202 11:12:11.713939 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:12:26 crc kubenswrapper[4845]: I0202 11:12:26.713328 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:12:26 crc kubenswrapper[4845]: E0202 11:12:26.714311 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:12:40 crc kubenswrapper[4845]: I0202 11:12:40.713089 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:12:40 crc kubenswrapper[4845]: E0202 11:12:40.713846 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:12:51 crc kubenswrapper[4845]: I0202 11:12:51.712845 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:12:51 crc kubenswrapper[4845]: E0202 11:12:51.713574 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:13:02 crc kubenswrapper[4845]: I0202 11:13:02.712520 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:13:02 crc kubenswrapper[4845]: E0202 11:13:02.713229 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:13:13 crc kubenswrapper[4845]: I0202 11:13:13.712641 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:13:13 crc kubenswrapper[4845]: E0202 11:13:13.713587 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:13:27 crc kubenswrapper[4845]: I0202 11:13:27.712344 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:13:27 crc kubenswrapper[4845]: E0202 11:13:27.713335 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:13:38 crc kubenswrapper[4845]: I0202 11:13:38.714021 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:13:38 crc kubenswrapper[4845]: E0202 11:13:38.715394 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:13:52 crc kubenswrapper[4845]: I0202 11:13:52.714053 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:13:53 crc kubenswrapper[4845]: I0202 11:13:53.247639 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"b01c1f6b8a2da00bbbdac89a515179f56a7b50789ec892275046cb267212a033"} Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.154024 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn"] Feb 02 11:15:00 crc kubenswrapper[4845]: E0202 11:15:00.165001 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerName="extract-utilities" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.165047 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerName="extract-utilities" Feb 02 11:15:00 crc kubenswrapper[4845]: E0202 11:15:00.165075 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerName="extract-utilities" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.165084 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerName="extract-utilities" Feb 02 11:15:00 crc kubenswrapper[4845]: E0202 11:15:00.165128 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.165138 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4845]: E0202 11:15:00.165173 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.165182 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" 
containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4845]: E0202 11:15:00.165208 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerName="extract-content" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.165216 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerName="extract-content" Feb 02 11:15:00 crc kubenswrapper[4845]: E0202 11:15:00.165248 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerName="extract-content" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.165256 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerName="extract-content" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.166035 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3c8b0c-a23c-4e80-b25d-b35e1ee487b4" containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.166073 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0bb4b2-0075-4535-83d7-ff2a511bcfc4" containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.167997 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.175903 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.177037 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.183209 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn"] Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.301452 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-secret-volume\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.301526 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-config-volume\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.301607 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfgm2\" (UniqueName: \"kubernetes.io/projected/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-kube-api-access-qfgm2\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.407251 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfgm2\" (UniqueName: \"kubernetes.io/projected/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-kube-api-access-qfgm2\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.407485 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-secret-volume\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.407521 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-config-volume\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.408702 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-config-volume\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.415048 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-secret-volume\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.430287 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfgm2\" (UniqueName: \"kubernetes.io/projected/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-kube-api-access-qfgm2\") pod \"collect-profiles-29500515-lbbxn\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.505830 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:00 crc kubenswrapper[4845]: I0202 11:15:00.976797 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn"] Feb 02 11:15:01 crc kubenswrapper[4845]: I0202 11:15:01.926743 4845 generic.go:334] "Generic (PLEG): container finished" podID="dfe7b56f-4954-457d-8bb8-a0a50096cfb9" containerID="7f56eeae3b5e853cc5e5d1ab4c3fe6f56e0c913955d2cd95163f2033cd7e1417" exitCode=0 Feb 02 11:15:01 crc kubenswrapper[4845]: I0202 11:15:01.926814 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" event={"ID":"dfe7b56f-4954-457d-8bb8-a0a50096cfb9","Type":"ContainerDied","Data":"7f56eeae3b5e853cc5e5d1ab4c3fe6f56e0c913955d2cd95163f2033cd7e1417"} Feb 02 11:15:01 crc kubenswrapper[4845]: I0202 11:15:01.927150 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" 
event={"ID":"dfe7b56f-4954-457d-8bb8-a0a50096cfb9","Type":"ContainerStarted","Data":"9ec417b0fb92ea7a169c394bf23e5e292e5d100d941622f9d3876ee0cfbd387d"} Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.450634 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.553053 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-config-volume\") pod \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.553219 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfgm2\" (UniqueName: \"kubernetes.io/projected/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-kube-api-access-qfgm2\") pod \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.553289 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-secret-volume\") pod \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\" (UID: \"dfe7b56f-4954-457d-8bb8-a0a50096cfb9\") " Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.553829 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-config-volume" (OuterVolumeSpecName: "config-volume") pod "dfe7b56f-4954-457d-8bb8-a0a50096cfb9" (UID: "dfe7b56f-4954-457d-8bb8-a0a50096cfb9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.554202 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.562656 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-kube-api-access-qfgm2" (OuterVolumeSpecName: "kube-api-access-qfgm2") pod "dfe7b56f-4954-457d-8bb8-a0a50096cfb9" (UID: "dfe7b56f-4954-457d-8bb8-a0a50096cfb9"). InnerVolumeSpecName "kube-api-access-qfgm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.569683 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dfe7b56f-4954-457d-8bb8-a0a50096cfb9" (UID: "dfe7b56f-4954-457d-8bb8-a0a50096cfb9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.655973 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.656014 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfgm2\" (UniqueName: \"kubernetes.io/projected/dfe7b56f-4954-457d-8bb8-a0a50096cfb9-kube-api-access-qfgm2\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.947949 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" event={"ID":"dfe7b56f-4954-457d-8bb8-a0a50096cfb9","Type":"ContainerDied","Data":"9ec417b0fb92ea7a169c394bf23e5e292e5d100d941622f9d3876ee0cfbd387d"} Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.948000 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ec417b0fb92ea7a169c394bf23e5e292e5d100d941622f9d3876ee0cfbd387d" Feb 02 11:15:03 crc kubenswrapper[4845]: I0202 11:15:03.948001 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn" Feb 02 11:15:04 crc kubenswrapper[4845]: I0202 11:15:04.533824 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg"] Feb 02 11:15:04 crc kubenswrapper[4845]: I0202 11:15:04.542959 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-ncqjg"] Feb 02 11:15:04 crc kubenswrapper[4845]: I0202 11:15:04.985000 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rvh42"] Feb 02 11:15:04 crc kubenswrapper[4845]: E0202 11:15:04.985668 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe7b56f-4954-457d-8bb8-a0a50096cfb9" containerName="collect-profiles" Feb 02 11:15:04 crc kubenswrapper[4845]: I0202 11:15:04.985688 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe7b56f-4954-457d-8bb8-a0a50096cfb9" containerName="collect-profiles" Feb 02 11:15:04 crc kubenswrapper[4845]: I0202 11:15:04.985988 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe7b56f-4954-457d-8bb8-a0a50096cfb9" containerName="collect-profiles" Feb 02 11:15:04 crc kubenswrapper[4845]: I0202 11:15:04.987940 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:04 crc kubenswrapper[4845]: I0202 11:15:04.995953 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvh42"] Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.091373 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-catalog-content\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.091452 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq2sz\" (UniqueName: \"kubernetes.io/projected/869bb2a7-f892-4849-ac0e-e221a7987251-kube-api-access-qq2sz\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.091582 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-utilities\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.194472 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-utilities\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.194699 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-catalog-content\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.194741 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq2sz\" (UniqueName: \"kubernetes.io/projected/869bb2a7-f892-4849-ac0e-e221a7987251-kube-api-access-qq2sz\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.195077 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-utilities\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.195204 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-catalog-content\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.212784 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq2sz\" (UniqueName: \"kubernetes.io/projected/869bb2a7-f892-4849-ac0e-e221a7987251-kube-api-access-qq2sz\") pod \"redhat-marketplace-rvh42\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.310781 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.727789 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30bde55e-4121-4b71-b6f4-6cb3a9acd82e" path="/var/lib/kubelet/pods/30bde55e-4121-4b71-b6f4-6cb3a9acd82e/volumes" Feb 02 11:15:05 crc kubenswrapper[4845]: W0202 11:15:05.857110 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869bb2a7_f892_4849_ac0e_e221a7987251.slice/crio-ed60c4aa373b4a91c9b09fc640177f562ba99e067409a68ddc930f16c13babac WatchSource:0}: Error finding container ed60c4aa373b4a91c9b09fc640177f562ba99e067409a68ddc930f16c13babac: Status 404 returned error can't find the container with id ed60c4aa373b4a91c9b09fc640177f562ba99e067409a68ddc930f16c13babac Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.857881 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvh42"] Feb 02 11:15:05 crc kubenswrapper[4845]: I0202 11:15:05.967910 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvh42" event={"ID":"869bb2a7-f892-4849-ac0e-e221a7987251","Type":"ContainerStarted","Data":"ed60c4aa373b4a91c9b09fc640177f562ba99e067409a68ddc930f16c13babac"} Feb 02 11:15:06 crc kubenswrapper[4845]: I0202 11:15:06.979802 4845 generic.go:334] "Generic (PLEG): container finished" podID="869bb2a7-f892-4849-ac0e-e221a7987251" containerID="26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50" exitCode=0 Feb 02 11:15:06 crc kubenswrapper[4845]: I0202 11:15:06.979940 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvh42" event={"ID":"869bb2a7-f892-4849-ac0e-e221a7987251","Type":"ContainerDied","Data":"26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50"} Feb 02 11:15:06 crc kubenswrapper[4845]: I0202 
11:15:06.984205 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:15:07 crc kubenswrapper[4845]: I0202 11:15:07.992547 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvh42" event={"ID":"869bb2a7-f892-4849-ac0e-e221a7987251","Type":"ContainerStarted","Data":"9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7"} Feb 02 11:15:10 crc kubenswrapper[4845]: I0202 11:15:10.016599 4845 generic.go:334] "Generic (PLEG): container finished" podID="869bb2a7-f892-4849-ac0e-e221a7987251" containerID="9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7" exitCode=0 Feb 02 11:15:10 crc kubenswrapper[4845]: I0202 11:15:10.016702 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvh42" event={"ID":"869bb2a7-f892-4849-ac0e-e221a7987251","Type":"ContainerDied","Data":"9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7"} Feb 02 11:15:10 crc kubenswrapper[4845]: I0202 11:15:10.820090 4845 scope.go:117] "RemoveContainer" containerID="523a5bcb42e2050e61412c7d355eedcd8752caa8ebcb7f962918e3ce965038aa" Feb 02 11:15:11 crc kubenswrapper[4845]: I0202 11:15:11.028118 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvh42" event={"ID":"869bb2a7-f892-4849-ac0e-e221a7987251","Type":"ContainerStarted","Data":"68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0"} Feb 02 11:15:15 crc kubenswrapper[4845]: I0202 11:15:15.312020 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:15 crc kubenswrapper[4845]: I0202 11:15:15.312379 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:15 crc kubenswrapper[4845]: I0202 11:15:15.360399 4845 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:15 crc kubenswrapper[4845]: I0202 11:15:15.382089 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rvh42" podStartSLOduration=7.82161947 podStartE2EDuration="11.382069045s" podCreationTimestamp="2026-02-02 11:15:04 +0000 UTC" firstStartedPulling="2026-02-02 11:15:06.98380352 +0000 UTC m=+2588.075204980" lastFinishedPulling="2026-02-02 11:15:10.544253105 +0000 UTC m=+2591.635654555" observedRunningTime="2026-02-02 11:15:11.045043649 +0000 UTC m=+2592.136445119" watchObservedRunningTime="2026-02-02 11:15:15.382069045 +0000 UTC m=+2596.473470495" Feb 02 11:15:16 crc kubenswrapper[4845]: I0202 11:15:16.191343 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:16 crc kubenswrapper[4845]: I0202 11:15:16.270126 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvh42"] Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.101213 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rvh42" podUID="869bb2a7-f892-4849-ac0e-e221a7987251" containerName="registry-server" containerID="cri-o://68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0" gracePeriod=2 Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.635455 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.729105 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq2sz\" (UniqueName: \"kubernetes.io/projected/869bb2a7-f892-4849-ac0e-e221a7987251-kube-api-access-qq2sz\") pod \"869bb2a7-f892-4849-ac0e-e221a7987251\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.729282 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-catalog-content\") pod \"869bb2a7-f892-4849-ac0e-e221a7987251\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.729335 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-utilities\") pod \"869bb2a7-f892-4849-ac0e-e221a7987251\" (UID: \"869bb2a7-f892-4849-ac0e-e221a7987251\") " Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.730308 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-utilities" (OuterVolumeSpecName: "utilities") pod "869bb2a7-f892-4849-ac0e-e221a7987251" (UID: "869bb2a7-f892-4849-ac0e-e221a7987251"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.735081 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869bb2a7-f892-4849-ac0e-e221a7987251-kube-api-access-qq2sz" (OuterVolumeSpecName: "kube-api-access-qq2sz") pod "869bb2a7-f892-4849-ac0e-e221a7987251" (UID: "869bb2a7-f892-4849-ac0e-e221a7987251"). InnerVolumeSpecName "kube-api-access-qq2sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.757400 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "869bb2a7-f892-4849-ac0e-e221a7987251" (UID: "869bb2a7-f892-4849-ac0e-e221a7987251"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.833864 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq2sz\" (UniqueName: \"kubernetes.io/projected/869bb2a7-f892-4849-ac0e-e221a7987251-kube-api-access-qq2sz\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.834075 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:18 crc kubenswrapper[4845]: I0202 11:15:18.834095 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/869bb2a7-f892-4849-ac0e-e221a7987251-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.114923 4845 generic.go:334] "Generic (PLEG): container finished" podID="869bb2a7-f892-4849-ac0e-e221a7987251" containerID="68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0" exitCode=0 Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.114987 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvh42" event={"ID":"869bb2a7-f892-4849-ac0e-e221a7987251","Type":"ContainerDied","Data":"68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0"} Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.115019 4845 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-rvh42" event={"ID":"869bb2a7-f892-4849-ac0e-e221a7987251","Type":"ContainerDied","Data":"ed60c4aa373b4a91c9b09fc640177f562ba99e067409a68ddc930f16c13babac"} Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.115036 4845 scope.go:117] "RemoveContainer" containerID="68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.115192 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvh42" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.152334 4845 scope.go:117] "RemoveContainer" containerID="9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.190057 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvh42"] Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.198283 4845 scope.go:117] "RemoveContainer" containerID="26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.227751 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvh42"] Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.274827 4845 scope.go:117] "RemoveContainer" containerID="68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0" Feb 02 11:15:19 crc kubenswrapper[4845]: E0202 11:15:19.275554 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0\": container with ID starting with 68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0 not found: ID does not exist" containerID="68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.275601 4845 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0"} err="failed to get container status \"68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0\": rpc error: code = NotFound desc = could not find container \"68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0\": container with ID starting with 68cd5f0ac01a36ca7a9ded3a0b4c9ef97c96705fc1fd0960be62ed0ce82d05e0 not found: ID does not exist" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.275628 4845 scope.go:117] "RemoveContainer" containerID="9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7" Feb 02 11:15:19 crc kubenswrapper[4845]: E0202 11:15:19.276378 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7\": container with ID starting with 9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7 not found: ID does not exist" containerID="9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.276505 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7"} err="failed to get container status \"9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7\": rpc error: code = NotFound desc = could not find container \"9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7\": container with ID starting with 9c6da0a6bdf6ad9ffea1463a6599823f61c5157e65caf9cbdab9698c761dd0b7 not found: ID does not exist" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.276606 4845 scope.go:117] "RemoveContainer" containerID="26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50" Feb 02 11:15:19 crc kubenswrapper[4845]: E0202 
11:15:19.277126 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50\": container with ID starting with 26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50 not found: ID does not exist" containerID="26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.277159 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50"} err="failed to get container status \"26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50\": rpc error: code = NotFound desc = could not find container \"26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50\": container with ID starting with 26db0c558d019a5675900b1570f42ffba048eb1fb33377cdc063d50635f56e50 not found: ID does not exist" Feb 02 11:15:19 crc kubenswrapper[4845]: I0202 11:15:19.729705 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869bb2a7-f892-4849-ac0e-e221a7987251" path="/var/lib/kubelet/pods/869bb2a7-f892-4849-ac0e-e221a7987251/volumes" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.433179 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5s2tp"] Feb 02 11:15:57 crc kubenswrapper[4845]: E0202 11:15:57.434315 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869bb2a7-f892-4849-ac0e-e221a7987251" containerName="registry-server" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.434332 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="869bb2a7-f892-4849-ac0e-e221a7987251" containerName="registry-server" Feb 02 11:15:57 crc kubenswrapper[4845]: E0202 11:15:57.434360 4845 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="869bb2a7-f892-4849-ac0e-e221a7987251" containerName="extract-content" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.434368 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="869bb2a7-f892-4849-ac0e-e221a7987251" containerName="extract-content" Feb 02 11:15:57 crc kubenswrapper[4845]: E0202 11:15:57.434414 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869bb2a7-f892-4849-ac0e-e221a7987251" containerName="extract-utilities" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.434423 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="869bb2a7-f892-4849-ac0e-e221a7987251" containerName="extract-utilities" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.434715 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="869bb2a7-f892-4849-ac0e-e221a7987251" containerName="registry-server" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.436781 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.445001 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5s2tp"] Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.570476 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-utilities\") pod \"certified-operators-5s2tp\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.572911 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjbm\" (UniqueName: \"kubernetes.io/projected/6565f54e-2b32-49c5-bcca-06363f5bd2cb-kube-api-access-8jjbm\") pod \"certified-operators-5s2tp\" (UID: 
\"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.572988 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-catalog-content\") pod \"certified-operators-5s2tp\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.676410 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-utilities\") pod \"certified-operators-5s2tp\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.676527 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjbm\" (UniqueName: \"kubernetes.io/projected/6565f54e-2b32-49c5-bcca-06363f5bd2cb-kube-api-access-8jjbm\") pod \"certified-operators-5s2tp\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.676550 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-catalog-content\") pod \"certified-operators-5s2tp\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.677158 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-utilities\") pod \"certified-operators-5s2tp\" (UID: 
\"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.677414 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-catalog-content\") pod \"certified-operators-5s2tp\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.696755 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjbm\" (UniqueName: \"kubernetes.io/projected/6565f54e-2b32-49c5-bcca-06363f5bd2cb-kube-api-access-8jjbm\") pod \"certified-operators-5s2tp\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:57 crc kubenswrapper[4845]: I0202 11:15:57.770071 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:15:58 crc kubenswrapper[4845]: I0202 11:15:58.346287 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5s2tp"] Feb 02 11:15:58 crc kubenswrapper[4845]: I0202 11:15:58.560693 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s2tp" event={"ID":"6565f54e-2b32-49c5-bcca-06363f5bd2cb","Type":"ContainerStarted","Data":"0eadb76ac2e8cfb4b265a90d64feb6cfe86c5e67914191034e8e058d1dd59529"} Feb 02 11:15:59 crc kubenswrapper[4845]: I0202 11:15:59.572346 4845 generic.go:334] "Generic (PLEG): container finished" podID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerID="64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4" exitCode=0 Feb 02 11:15:59 crc kubenswrapper[4845]: I0202 11:15:59.572409 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s2tp" event={"ID":"6565f54e-2b32-49c5-bcca-06363f5bd2cb","Type":"ContainerDied","Data":"64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4"} Feb 02 11:16:01 crc kubenswrapper[4845]: I0202 11:16:01.597502 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s2tp" event={"ID":"6565f54e-2b32-49c5-bcca-06363f5bd2cb","Type":"ContainerStarted","Data":"3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb"} Feb 02 11:16:02 crc kubenswrapper[4845]: I0202 11:16:02.609475 4845 generic.go:334] "Generic (PLEG): container finished" podID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerID="3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb" exitCode=0 Feb 02 11:16:02 crc kubenswrapper[4845]: I0202 11:16:02.609550 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s2tp" 
event={"ID":"6565f54e-2b32-49c5-bcca-06363f5bd2cb","Type":"ContainerDied","Data":"3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb"} Feb 02 11:16:03 crc kubenswrapper[4845]: I0202 11:16:03.623473 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s2tp" event={"ID":"6565f54e-2b32-49c5-bcca-06363f5bd2cb","Type":"ContainerStarted","Data":"f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1"} Feb 02 11:16:03 crc kubenswrapper[4845]: I0202 11:16:03.649847 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5s2tp" podStartSLOduration=3.109125584 podStartE2EDuration="6.649802917s" podCreationTimestamp="2026-02-02 11:15:57 +0000 UTC" firstStartedPulling="2026-02-02 11:15:59.574745997 +0000 UTC m=+2640.666147447" lastFinishedPulling="2026-02-02 11:16:03.11542334 +0000 UTC m=+2644.206824780" observedRunningTime="2026-02-02 11:16:03.642646389 +0000 UTC m=+2644.734047839" watchObservedRunningTime="2026-02-02 11:16:03.649802917 +0000 UTC m=+2644.741204367" Feb 02 11:16:07 crc kubenswrapper[4845]: I0202 11:16:07.770491 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:16:07 crc kubenswrapper[4845]: I0202 11:16:07.771086 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:16:07 crc kubenswrapper[4845]: I0202 11:16:07.824167 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:16:08 crc kubenswrapper[4845]: I0202 11:16:08.724874 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:16:08 crc kubenswrapper[4845]: I0202 11:16:08.785344 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-5s2tp"] Feb 02 11:16:10 crc kubenswrapper[4845]: I0202 11:16:10.695720 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5s2tp" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerName="registry-server" containerID="cri-o://f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1" gracePeriod=2 Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.219253 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.318572 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jjbm\" (UniqueName: \"kubernetes.io/projected/6565f54e-2b32-49c5-bcca-06363f5bd2cb-kube-api-access-8jjbm\") pod \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.318672 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-utilities\") pod \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.318727 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-catalog-content\") pod \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\" (UID: \"6565f54e-2b32-49c5-bcca-06363f5bd2cb\") " Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.320082 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-utilities" (OuterVolumeSpecName: "utilities") pod "6565f54e-2b32-49c5-bcca-06363f5bd2cb" (UID: 
"6565f54e-2b32-49c5-bcca-06363f5bd2cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.326005 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6565f54e-2b32-49c5-bcca-06363f5bd2cb-kube-api-access-8jjbm" (OuterVolumeSpecName: "kube-api-access-8jjbm") pod "6565f54e-2b32-49c5-bcca-06363f5bd2cb" (UID: "6565f54e-2b32-49c5-bcca-06363f5bd2cb"). InnerVolumeSpecName "kube-api-access-8jjbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.375160 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6565f54e-2b32-49c5-bcca-06363f5bd2cb" (UID: "6565f54e-2b32-49c5-bcca-06363f5bd2cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.421847 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.421906 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6565f54e-2b32-49c5-bcca-06363f5bd2cb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.421926 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jjbm\" (UniqueName: \"kubernetes.io/projected/6565f54e-2b32-49c5-bcca-06363f5bd2cb-kube-api-access-8jjbm\") on node \"crc\" DevicePath \"\"" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.735623 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerID="f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1" exitCode=0 Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.735673 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s2tp" event={"ID":"6565f54e-2b32-49c5-bcca-06363f5bd2cb","Type":"ContainerDied","Data":"f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1"} Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.735703 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5s2tp" event={"ID":"6565f54e-2b32-49c5-bcca-06363f5bd2cb","Type":"ContainerDied","Data":"0eadb76ac2e8cfb4b265a90d64feb6cfe86c5e67914191034e8e058d1dd59529"} Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.735719 4845 scope.go:117] "RemoveContainer" containerID="f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.735735 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5s2tp" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.775672 4845 scope.go:117] "RemoveContainer" containerID="3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.828592 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5s2tp"] Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.834056 4845 scope.go:117] "RemoveContainer" containerID="64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.868360 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5s2tp"] Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.953692 4845 scope.go:117] "RemoveContainer" containerID="f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1" Feb 02 11:16:11 crc kubenswrapper[4845]: E0202 11:16:11.956184 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1\": container with ID starting with f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1 not found: ID does not exist" containerID="f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.956231 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1"} err="failed to get container status \"f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1\": rpc error: code = NotFound desc = could not find container \"f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1\": container with ID starting with f4aeaf1febfad31ffcdc67a3c3800e4db1149b548940bc3e4085c7d0f8dfe7d1 not 
found: ID does not exist" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.956258 4845 scope.go:117] "RemoveContainer" containerID="3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb" Feb 02 11:16:11 crc kubenswrapper[4845]: E0202 11:16:11.956726 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb\": container with ID starting with 3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb not found: ID does not exist" containerID="3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.956769 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb"} err="failed to get container status \"3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb\": rpc error: code = NotFound desc = could not find container \"3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb\": container with ID starting with 3dc191087e15c4b911d66a8f918f69ea24d12a7212ba993341bf4fa750a064cb not found: ID does not exist" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.956788 4845 scope.go:117] "RemoveContainer" containerID="64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4" Feb 02 11:16:11 crc kubenswrapper[4845]: E0202 11:16:11.957087 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4\": container with ID starting with 64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4 not found: ID does not exist" containerID="64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4" Feb 02 11:16:11 crc kubenswrapper[4845]: I0202 11:16:11.957158 4845 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4"} err="failed to get container status \"64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4\": rpc error: code = NotFound desc = could not find container \"64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4\": container with ID starting with 64a1356a4dc372526ff0d42d69502467f92dd313c319f02635f9d19ec0032cf4 not found: ID does not exist" Feb 02 11:16:13 crc kubenswrapper[4845]: I0202 11:16:13.728173 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" path="/var/lib/kubelet/pods/6565f54e-2b32-49c5-bcca-06363f5bd2cb/volumes" Feb 02 11:16:16 crc kubenswrapper[4845]: I0202 11:16:16.237195 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:16:16 crc kubenswrapper[4845]: I0202 11:16:16.237530 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:16:46 crc kubenswrapper[4845]: I0202 11:16:46.237815 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:16:46 crc kubenswrapper[4845]: I0202 11:16:46.238392 4845 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:17:16 crc kubenswrapper[4845]: I0202 11:17:16.238292 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:17:16 crc kubenswrapper[4845]: I0202 11:17:16.238911 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:17:16 crc kubenswrapper[4845]: I0202 11:17:16.238974 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 11:17:16 crc kubenswrapper[4845]: I0202 11:17:16.240474 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b01c1f6b8a2da00bbbdac89a515179f56a7b50789ec892275046cb267212a033"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:17:16 crc kubenswrapper[4845]: I0202 11:17:16.240586 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" 
containerID="cri-o://b01c1f6b8a2da00bbbdac89a515179f56a7b50789ec892275046cb267212a033" gracePeriod=600 Feb 02 11:17:16 crc kubenswrapper[4845]: I0202 11:17:16.419987 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="b01c1f6b8a2da00bbbdac89a515179f56a7b50789ec892275046cb267212a033" exitCode=0 Feb 02 11:17:16 crc kubenswrapper[4845]: I0202 11:17:16.420076 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"b01c1f6b8a2da00bbbdac89a515179f56a7b50789ec892275046cb267212a033"} Feb 02 11:17:16 crc kubenswrapper[4845]: I0202 11:17:16.420600 4845 scope.go:117] "RemoveContainer" containerID="1ba20d6a21bdd300599d4a9526d3a1de22ef5453da0f4e550c24dab0415a92cc" Feb 02 11:17:17 crc kubenswrapper[4845]: I0202 11:17:17.435367 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c"} Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.094570 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sn4l9"] Feb 02 11:18:16 crc kubenswrapper[4845]: E0202 11:18:16.095777 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerName="extract-content" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.095793 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerName="extract-content" Feb 02 11:18:16 crc kubenswrapper[4845]: E0202 11:18:16.095814 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerName="extract-utilities" Feb 02 
11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.095824 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerName="extract-utilities" Feb 02 11:18:16 crc kubenswrapper[4845]: E0202 11:18:16.095848 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerName="registry-server" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.095857 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerName="registry-server" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.096158 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6565f54e-2b32-49c5-bcca-06363f5bd2cb" containerName="registry-server" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.099777 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.105009 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sn4l9"] Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.221203 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-utilities\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.221490 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-catalog-content\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc 
kubenswrapper[4845]: I0202 11:18:16.221560 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx5b8\" (UniqueName: \"kubernetes.io/projected/767d7a70-a583-4d16-abd2-675171ae5138-kube-api-access-jx5b8\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.324016 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-utilities\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.324246 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-catalog-content\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.324316 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx5b8\" (UniqueName: \"kubernetes.io/projected/767d7a70-a583-4d16-abd2-675171ae5138-kube-api-access-jx5b8\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.324448 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-utilities\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc 
kubenswrapper[4845]: I0202 11:18:16.324599 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-catalog-content\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.347668 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx5b8\" (UniqueName: \"kubernetes.io/projected/767d7a70-a583-4d16-abd2-675171ae5138-kube-api-access-jx5b8\") pod \"community-operators-sn4l9\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:16 crc kubenswrapper[4845]: I0202 11:18:16.436083 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:17 crc kubenswrapper[4845]: I0202 11:18:17.005272 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sn4l9"] Feb 02 11:18:17 crc kubenswrapper[4845]: I0202 11:18:17.120815 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn4l9" event={"ID":"767d7a70-a583-4d16-abd2-675171ae5138","Type":"ContainerStarted","Data":"661ef6cebe116cb881e59387e7f3437734bcb869e84682751ec539128c5e9d35"} Feb 02 11:18:18 crc kubenswrapper[4845]: I0202 11:18:18.131203 4845 generic.go:334] "Generic (PLEG): container finished" podID="767d7a70-a583-4d16-abd2-675171ae5138" containerID="1772d1d13d50f55945231fffbf3a4f639758727f28938da97d680de269b43331" exitCode=0 Feb 02 11:18:18 crc kubenswrapper[4845]: I0202 11:18:18.131268 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn4l9" 
event={"ID":"767d7a70-a583-4d16-abd2-675171ae5138","Type":"ContainerDied","Data":"1772d1d13d50f55945231fffbf3a4f639758727f28938da97d680de269b43331"} Feb 02 11:18:19 crc kubenswrapper[4845]: I0202 11:18:19.157765 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn4l9" event={"ID":"767d7a70-a583-4d16-abd2-675171ae5138","Type":"ContainerStarted","Data":"86de663521576bead672c95842f6ca66e30909f7018fa47197e8fe7a74d69cdc"} Feb 02 11:18:20 crc kubenswrapper[4845]: I0202 11:18:20.169679 4845 generic.go:334] "Generic (PLEG): container finished" podID="767d7a70-a583-4d16-abd2-675171ae5138" containerID="86de663521576bead672c95842f6ca66e30909f7018fa47197e8fe7a74d69cdc" exitCode=0 Feb 02 11:18:20 crc kubenswrapper[4845]: I0202 11:18:20.169739 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn4l9" event={"ID":"767d7a70-a583-4d16-abd2-675171ae5138","Type":"ContainerDied","Data":"86de663521576bead672c95842f6ca66e30909f7018fa47197e8fe7a74d69cdc"} Feb 02 11:18:21 crc kubenswrapper[4845]: I0202 11:18:21.186077 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn4l9" event={"ID":"767d7a70-a583-4d16-abd2-675171ae5138","Type":"ContainerStarted","Data":"d5e2157f1eef1e64120ddf4c5a3541e0b1d93226d3ae58eff596372e1ee9778e"} Feb 02 11:18:21 crc kubenswrapper[4845]: I0202 11:18:21.221410 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sn4l9" podStartSLOduration=2.659359963 podStartE2EDuration="5.221371472s" podCreationTimestamp="2026-02-02 11:18:16 +0000 UTC" firstStartedPulling="2026-02-02 11:18:18.13406853 +0000 UTC m=+2779.225469990" lastFinishedPulling="2026-02-02 11:18:20.696080049 +0000 UTC m=+2781.787481499" observedRunningTime="2026-02-02 11:18:21.213400552 +0000 UTC m=+2782.304802002" watchObservedRunningTime="2026-02-02 11:18:21.221371472 +0000 UTC 
m=+2782.312772922" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.682268 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dmf4h"] Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.686684 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.706467 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmf4h"] Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.744496 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-catalog-content\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.745504 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcrkv\" (UniqueName: \"kubernetes.io/projected/c6147389-3624-4919-ba37-600b9c23a55e-kube-api-access-bcrkv\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.745560 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-utilities\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.848400 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcrkv\" (UniqueName: 
\"kubernetes.io/projected/c6147389-3624-4919-ba37-600b9c23a55e-kube-api-access-bcrkv\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.848504 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-utilities\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.848807 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-catalog-content\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.849071 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-utilities\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.849372 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-catalog-content\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:24 crc kubenswrapper[4845]: I0202 11:18:24.867224 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcrkv\" (UniqueName: 
\"kubernetes.io/projected/c6147389-3624-4919-ba37-600b9c23a55e-kube-api-access-bcrkv\") pod \"redhat-operators-dmf4h\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:25 crc kubenswrapper[4845]: I0202 11:18:25.022840 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:25 crc kubenswrapper[4845]: I0202 11:18:25.562237 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dmf4h"] Feb 02 11:18:26 crc kubenswrapper[4845]: I0202 11:18:26.255761 4845 generic.go:334] "Generic (PLEG): container finished" podID="c6147389-3624-4919-ba37-600b9c23a55e" containerID="ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97" exitCode=0 Feb 02 11:18:26 crc kubenswrapper[4845]: I0202 11:18:26.255847 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmf4h" event={"ID":"c6147389-3624-4919-ba37-600b9c23a55e","Type":"ContainerDied","Data":"ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97"} Feb 02 11:18:26 crc kubenswrapper[4845]: I0202 11:18:26.256188 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmf4h" event={"ID":"c6147389-3624-4919-ba37-600b9c23a55e","Type":"ContainerStarted","Data":"c8397ff7e1a78cb1133cedaa7ed5721a42eb9c766cc160662ea2ab8efe50c5b6"} Feb 02 11:18:26 crc kubenswrapper[4845]: I0202 11:18:26.436941 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:26 crc kubenswrapper[4845]: I0202 11:18:26.437503 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:26 crc kubenswrapper[4845]: I0202 11:18:26.490263 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:27 crc kubenswrapper[4845]: I0202 11:18:27.316202 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:28 crc kubenswrapper[4845]: I0202 11:18:28.278225 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmf4h" event={"ID":"c6147389-3624-4919-ba37-600b9c23a55e","Type":"ContainerStarted","Data":"47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac"} Feb 02 11:18:29 crc kubenswrapper[4845]: I0202 11:18:29.272709 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sn4l9"] Feb 02 11:18:29 crc kubenswrapper[4845]: I0202 11:18:29.288593 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sn4l9" podUID="767d7a70-a583-4d16-abd2-675171ae5138" containerName="registry-server" containerID="cri-o://d5e2157f1eef1e64120ddf4c5a3541e0b1d93226d3ae58eff596372e1ee9778e" gracePeriod=2 Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.301775 4845 generic.go:334] "Generic (PLEG): container finished" podID="767d7a70-a583-4d16-abd2-675171ae5138" containerID="d5e2157f1eef1e64120ddf4c5a3541e0b1d93226d3ae58eff596372e1ee9778e" exitCode=0 Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.301983 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn4l9" event={"ID":"767d7a70-a583-4d16-abd2-675171ae5138","Type":"ContainerDied","Data":"d5e2157f1eef1e64120ddf4c5a3541e0b1d93226d3ae58eff596372e1ee9778e"} Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.755836 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.816857 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-catalog-content\") pod \"767d7a70-a583-4d16-abd2-675171ae5138\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.816993 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-utilities\") pod \"767d7a70-a583-4d16-abd2-675171ae5138\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.817033 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx5b8\" (UniqueName: \"kubernetes.io/projected/767d7a70-a583-4d16-abd2-675171ae5138-kube-api-access-jx5b8\") pod \"767d7a70-a583-4d16-abd2-675171ae5138\" (UID: \"767d7a70-a583-4d16-abd2-675171ae5138\") " Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.818858 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-utilities" (OuterVolumeSpecName: "utilities") pod "767d7a70-a583-4d16-abd2-675171ae5138" (UID: "767d7a70-a583-4d16-abd2-675171ae5138"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.824691 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767d7a70-a583-4d16-abd2-675171ae5138-kube-api-access-jx5b8" (OuterVolumeSpecName: "kube-api-access-jx5b8") pod "767d7a70-a583-4d16-abd2-675171ae5138" (UID: "767d7a70-a583-4d16-abd2-675171ae5138"). InnerVolumeSpecName "kube-api-access-jx5b8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.869918 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "767d7a70-a583-4d16-abd2-675171ae5138" (UID: "767d7a70-a583-4d16-abd2-675171ae5138"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.919915 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.919960 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/767d7a70-a583-4d16-abd2-675171ae5138-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:30 crc kubenswrapper[4845]: I0202 11:18:30.919976 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx5b8\" (UniqueName: \"kubernetes.io/projected/767d7a70-a583-4d16-abd2-675171ae5138-kube-api-access-jx5b8\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:31 crc kubenswrapper[4845]: I0202 11:18:31.317204 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sn4l9" event={"ID":"767d7a70-a583-4d16-abd2-675171ae5138","Type":"ContainerDied","Data":"661ef6cebe116cb881e59387e7f3437734bcb869e84682751ec539128c5e9d35"} Feb 02 11:18:31 crc kubenswrapper[4845]: I0202 11:18:31.317269 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sn4l9" Feb 02 11:18:31 crc kubenswrapper[4845]: I0202 11:18:31.317631 4845 scope.go:117] "RemoveContainer" containerID="d5e2157f1eef1e64120ddf4c5a3541e0b1d93226d3ae58eff596372e1ee9778e" Feb 02 11:18:31 crc kubenswrapper[4845]: I0202 11:18:31.343275 4845 scope.go:117] "RemoveContainer" containerID="86de663521576bead672c95842f6ca66e30909f7018fa47197e8fe7a74d69cdc" Feb 02 11:18:31 crc kubenswrapper[4845]: I0202 11:18:31.371196 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sn4l9"] Feb 02 11:18:31 crc kubenswrapper[4845]: I0202 11:18:31.388581 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sn4l9"] Feb 02 11:18:31 crc kubenswrapper[4845]: I0202 11:18:31.400010 4845 scope.go:117] "RemoveContainer" containerID="1772d1d13d50f55945231fffbf3a4f639758727f28938da97d680de269b43331" Feb 02 11:18:31 crc kubenswrapper[4845]: I0202 11:18:31.743031 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="767d7a70-a583-4d16-abd2-675171ae5138" path="/var/lib/kubelet/pods/767d7a70-a583-4d16-abd2-675171ae5138/volumes" Feb 02 11:18:33 crc kubenswrapper[4845]: I0202 11:18:33.346769 4845 generic.go:334] "Generic (PLEG): container finished" podID="c6147389-3624-4919-ba37-600b9c23a55e" containerID="47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac" exitCode=0 Feb 02 11:18:33 crc kubenswrapper[4845]: I0202 11:18:33.346869 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmf4h" event={"ID":"c6147389-3624-4919-ba37-600b9c23a55e","Type":"ContainerDied","Data":"47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac"} Feb 02 11:18:34 crc kubenswrapper[4845]: I0202 11:18:34.359156 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmf4h" 
event={"ID":"c6147389-3624-4919-ba37-600b9c23a55e","Type":"ContainerStarted","Data":"77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0"} Feb 02 11:18:34 crc kubenswrapper[4845]: I0202 11:18:34.393818 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dmf4h" podStartSLOduration=2.7911557719999998 podStartE2EDuration="10.393789695s" podCreationTimestamp="2026-02-02 11:18:24 +0000 UTC" firstStartedPulling="2026-02-02 11:18:26.257583029 +0000 UTC m=+2787.348984479" lastFinishedPulling="2026-02-02 11:18:33.860216952 +0000 UTC m=+2794.951618402" observedRunningTime="2026-02-02 11:18:34.381715455 +0000 UTC m=+2795.473116905" watchObservedRunningTime="2026-02-02 11:18:34.393789695 +0000 UTC m=+2795.485191145" Feb 02 11:18:35 crc kubenswrapper[4845]: I0202 11:18:35.023100 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:35 crc kubenswrapper[4845]: I0202 11:18:35.023155 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:36 crc kubenswrapper[4845]: I0202 11:18:36.076840 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dmf4h" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="registry-server" probeResult="failure" output=< Feb 02 11:18:36 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 11:18:36 crc kubenswrapper[4845]: > Feb 02 11:18:46 crc kubenswrapper[4845]: I0202 11:18:46.067985 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dmf4h" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="registry-server" probeResult="failure" output=< Feb 02 11:18:46 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 11:18:46 crc kubenswrapper[4845]: > 
Feb 02 11:18:55 crc kubenswrapper[4845]: I0202 11:18:55.075112 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:55 crc kubenswrapper[4845]: I0202 11:18:55.141614 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:55 crc kubenswrapper[4845]: I0202 11:18:55.316758 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dmf4h"] Feb 02 11:18:56 crc kubenswrapper[4845]: I0202 11:18:56.576612 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dmf4h" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="registry-server" containerID="cri-o://77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0" gracePeriod=2 Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.149428 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.316487 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-catalog-content\") pod \"c6147389-3624-4919-ba37-600b9c23a55e\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.316571 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-utilities\") pod \"c6147389-3624-4919-ba37-600b9c23a55e\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.316924 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcrkv\" (UniqueName: \"kubernetes.io/projected/c6147389-3624-4919-ba37-600b9c23a55e-kube-api-access-bcrkv\") pod \"c6147389-3624-4919-ba37-600b9c23a55e\" (UID: \"c6147389-3624-4919-ba37-600b9c23a55e\") " Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.317747 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-utilities" (OuterVolumeSpecName: "utilities") pod "c6147389-3624-4919-ba37-600b9c23a55e" (UID: "c6147389-3624-4919-ba37-600b9c23a55e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.318717 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.324599 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6147389-3624-4919-ba37-600b9c23a55e-kube-api-access-bcrkv" (OuterVolumeSpecName: "kube-api-access-bcrkv") pod "c6147389-3624-4919-ba37-600b9c23a55e" (UID: "c6147389-3624-4919-ba37-600b9c23a55e"). InnerVolumeSpecName "kube-api-access-bcrkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.422336 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcrkv\" (UniqueName: \"kubernetes.io/projected/c6147389-3624-4919-ba37-600b9c23a55e-kube-api-access-bcrkv\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.454174 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6147389-3624-4919-ba37-600b9c23a55e" (UID: "c6147389-3624-4919-ba37-600b9c23a55e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.525048 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6147389-3624-4919-ba37-600b9c23a55e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.615246 4845 generic.go:334] "Generic (PLEG): container finished" podID="c6147389-3624-4919-ba37-600b9c23a55e" containerID="77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0" exitCode=0 Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.615321 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmf4h" event={"ID":"c6147389-3624-4919-ba37-600b9c23a55e","Type":"ContainerDied","Data":"77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0"} Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.615410 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dmf4h" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.615441 4845 scope.go:117] "RemoveContainer" containerID="77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.615412 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dmf4h" event={"ID":"c6147389-3624-4919-ba37-600b9c23a55e","Type":"ContainerDied","Data":"c8397ff7e1a78cb1133cedaa7ed5721a42eb9c766cc160662ea2ab8efe50c5b6"} Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.656997 4845 scope.go:117] "RemoveContainer" containerID="47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.668898 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dmf4h"] Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.679665 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dmf4h"] Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.689417 4845 scope.go:117] "RemoveContainer" containerID="ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.730622 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6147389-3624-4919-ba37-600b9c23a55e" path="/var/lib/kubelet/pods/c6147389-3624-4919-ba37-600b9c23a55e/volumes" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.739654 4845 scope.go:117] "RemoveContainer" containerID="77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0" Feb 02 11:18:57 crc kubenswrapper[4845]: E0202 11:18:57.740315 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0\": container with ID starting with 
77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0 not found: ID does not exist" containerID="77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.740350 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0"} err="failed to get container status \"77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0\": rpc error: code = NotFound desc = could not find container \"77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0\": container with ID starting with 77fb0b9dc09d0eb23f38f1fd7e2bef60eb57f4d4c9ed9d4fb9e40e2f9558fad0 not found: ID does not exist" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.740376 4845 scope.go:117] "RemoveContainer" containerID="47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac" Feb 02 11:18:57 crc kubenswrapper[4845]: E0202 11:18:57.740981 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac\": container with ID starting with 47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac not found: ID does not exist" containerID="47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.741081 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac"} err="failed to get container status \"47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac\": rpc error: code = NotFound desc = could not find container \"47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac\": container with ID starting with 47a60e934a8b6791b05d0aafe880dd84949e46ec1a413fece2b0abcc118489ac not found: ID does not 
exist" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.741118 4845 scope.go:117] "RemoveContainer" containerID="ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97" Feb 02 11:18:57 crc kubenswrapper[4845]: E0202 11:18:57.741475 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97\": container with ID starting with ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97 not found: ID does not exist" containerID="ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97" Feb 02 11:18:57 crc kubenswrapper[4845]: I0202 11:18:57.741498 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97"} err="failed to get container status \"ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97\": rpc error: code = NotFound desc = could not find container \"ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97\": container with ID starting with ee326ff04ed15e3ada7db0063639a37d8fcee16151491f583e9fa6fbd3ca8e97 not found: ID does not exist" Feb 02 11:19:16 crc kubenswrapper[4845]: I0202 11:19:16.238096 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:19:16 crc kubenswrapper[4845]: I0202 11:19:16.238658 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 
11:19:46 crc kubenswrapper[4845]: I0202 11:19:46.237820 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:19:46 crc kubenswrapper[4845]: I0202 11:19:46.238452 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.237661 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.238246 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.238309 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.241525 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c"} 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.241610 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" gracePeriod=600 Feb 02 11:20:16 crc kubenswrapper[4845]: E0202 11:20:16.365722 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.435468 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" exitCode=0 Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.435517 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c"} Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.435556 4845 scope.go:117] "RemoveContainer" containerID="b01c1f6b8a2da00bbbdac89a515179f56a7b50789ec892275046cb267212a033" Feb 02 11:20:16 crc kubenswrapper[4845]: I0202 11:20:16.436473 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 
02 11:20:16 crc kubenswrapper[4845]: E0202 11:20:16.436927 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:20:27 crc kubenswrapper[4845]: I0202 11:20:27.712539 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:20:27 crc kubenswrapper[4845]: E0202 11:20:27.713451 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:20:39 crc kubenswrapper[4845]: I0202 11:20:39.721846 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:20:39 crc kubenswrapper[4845]: E0202 11:20:39.722754 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:20:50 crc kubenswrapper[4845]: I0202 11:20:50.712859 4845 scope.go:117] "RemoveContainer" 
containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:20:50 crc kubenswrapper[4845]: E0202 11:20:50.713590 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:21:01 crc kubenswrapper[4845]: I0202 11:21:01.714599 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:21:01 crc kubenswrapper[4845]: E0202 11:21:01.715502 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:21:14 crc kubenswrapper[4845]: I0202 11:21:14.713463 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:21:14 crc kubenswrapper[4845]: E0202 11:21:14.714365 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:21:27 crc kubenswrapper[4845]: I0202 11:21:27.720436 4845 scope.go:117] 
"RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:21:27 crc kubenswrapper[4845]: E0202 11:21:27.722878 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:21:41 crc kubenswrapper[4845]: I0202 11:21:41.717531 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:21:41 crc kubenswrapper[4845]: E0202 11:21:41.718162 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:21:53 crc kubenswrapper[4845]: I0202 11:21:53.713901 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:21:53 crc kubenswrapper[4845]: E0202 11:21:53.715269 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:22:07 crc kubenswrapper[4845]: I0202 11:22:07.713276 
4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:22:07 crc kubenswrapper[4845]: E0202 11:22:07.714524 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:22:22 crc kubenswrapper[4845]: I0202 11:22:22.713524 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:22:22 crc kubenswrapper[4845]: E0202 11:22:22.714289 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:22:36 crc kubenswrapper[4845]: I0202 11:22:36.712946 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:22:36 crc kubenswrapper[4845]: E0202 11:22:36.713775 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:22:47 crc kubenswrapper[4845]: I0202 
11:22:47.713822 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:22:47 crc kubenswrapper[4845]: E0202 11:22:47.714526 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:23:00 crc kubenswrapper[4845]: I0202 11:23:00.712786 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:23:00 crc kubenswrapper[4845]: E0202 11:23:00.713572 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:23:15 crc kubenswrapper[4845]: I0202 11:23:15.712749 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:23:15 crc kubenswrapper[4845]: E0202 11:23:15.713878 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:23:30 crc 
kubenswrapper[4845]: I0202 11:23:30.713016 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:23:30 crc kubenswrapper[4845]: E0202 11:23:30.714989 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:23:42 crc kubenswrapper[4845]: I0202 11:23:42.712675 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:23:42 crc kubenswrapper[4845]: E0202 11:23:42.713430 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:23:56 crc kubenswrapper[4845]: I0202 11:23:56.713537 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:23:56 crc kubenswrapper[4845]: E0202 11:23:56.714487 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 
02 11:24:09 crc kubenswrapper[4845]: I0202 11:24:09.721253 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:24:09 crc kubenswrapper[4845]: E0202 11:24:09.722217 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:24:21 crc kubenswrapper[4845]: I0202 11:24:21.712226 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:24:21 crc kubenswrapper[4845]: E0202 11:24:21.713054 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:24:33 crc kubenswrapper[4845]: I0202 11:24:33.712932 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:24:33 crc kubenswrapper[4845]: E0202 11:24:33.713854 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:24:47 crc kubenswrapper[4845]: I0202 11:24:47.713463 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:24:47 crc kubenswrapper[4845]: E0202 11:24:47.714537 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:25:02 crc kubenswrapper[4845]: I0202 11:25:02.712929 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:25:02 crc kubenswrapper[4845]: E0202 11:25:02.713780 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:25:15 crc kubenswrapper[4845]: I0202 11:25:15.715044 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:25:15 crc kubenswrapper[4845]: E0202 11:25:15.715988 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:25:26 crc kubenswrapper[4845]: I0202 11:25:26.713542 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:25:27 crc kubenswrapper[4845]: I0202 11:25:27.905865 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"d56234b2e16d7741ff4ae8dc5b5dbf48186b954479748d7114398eb29a827588"} Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.834191 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8cjkz"] Feb 02 11:25:59 crc kubenswrapper[4845]: E0202 11:25:59.835253 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767d7a70-a583-4d16-abd2-675171ae5138" containerName="extract-utilities" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.835267 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="767d7a70-a583-4d16-abd2-675171ae5138" containerName="extract-utilities" Feb 02 11:25:59 crc kubenswrapper[4845]: E0202 11:25:59.835283 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="extract-content" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.835289 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="extract-content" Feb 02 11:25:59 crc kubenswrapper[4845]: E0202 11:25:59.835322 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="extract-utilities" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.835332 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="extract-utilities" 
Feb 02 11:25:59 crc kubenswrapper[4845]: E0202 11:25:59.835352 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="registry-server" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.835359 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="registry-server" Feb 02 11:25:59 crc kubenswrapper[4845]: E0202 11:25:59.835372 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767d7a70-a583-4d16-abd2-675171ae5138" containerName="registry-server" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.835381 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="767d7a70-a583-4d16-abd2-675171ae5138" containerName="registry-server" Feb 02 11:25:59 crc kubenswrapper[4845]: E0202 11:25:59.835409 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="767d7a70-a583-4d16-abd2-675171ae5138" containerName="extract-content" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.835415 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="767d7a70-a583-4d16-abd2-675171ae5138" containerName="extract-content" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.835621 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="767d7a70-a583-4d16-abd2-675171ae5138" containerName="registry-server" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.835645 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6147389-3624-4919-ba37-600b9c23a55e" containerName="registry-server" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.837268 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.871391 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cjkz"] Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.925939 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-utilities\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.926091 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-catalog-content\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:25:59 crc kubenswrapper[4845]: I0202 11:25:59.926134 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvjl\" (UniqueName: \"kubernetes.io/projected/24449f44-5470-48a2-b428-b5e44c302895-kube-api-access-4zvjl\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:00 crc kubenswrapper[4845]: I0202 11:26:00.029130 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-utilities\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:00 crc kubenswrapper[4845]: I0202 11:26:00.029390 4845 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-catalog-content\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:00 crc kubenswrapper[4845]: I0202 11:26:00.029421 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvjl\" (UniqueName: \"kubernetes.io/projected/24449f44-5470-48a2-b428-b5e44c302895-kube-api-access-4zvjl\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:00 crc kubenswrapper[4845]: I0202 11:26:00.029779 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-utilities\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:00 crc kubenswrapper[4845]: I0202 11:26:00.029856 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-catalog-content\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:00 crc kubenswrapper[4845]: I0202 11:26:00.063005 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvjl\" (UniqueName: \"kubernetes.io/projected/24449f44-5470-48a2-b428-b5e44c302895-kube-api-access-4zvjl\") pod \"redhat-marketplace-8cjkz\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:00 crc kubenswrapper[4845]: I0202 11:26:00.163594 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:00 crc kubenswrapper[4845]: I0202 11:26:00.716486 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cjkz"] Feb 02 11:26:01 crc kubenswrapper[4845]: I0202 11:26:01.252844 4845 generic.go:334] "Generic (PLEG): container finished" podID="24449f44-5470-48a2-b428-b5e44c302895" containerID="ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32" exitCode=0 Feb 02 11:26:01 crc kubenswrapper[4845]: I0202 11:26:01.253117 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cjkz" event={"ID":"24449f44-5470-48a2-b428-b5e44c302895","Type":"ContainerDied","Data":"ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32"} Feb 02 11:26:01 crc kubenswrapper[4845]: I0202 11:26:01.254326 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cjkz" event={"ID":"24449f44-5470-48a2-b428-b5e44c302895","Type":"ContainerStarted","Data":"59c0560168a919aad63948ffa20e93fc4981745b7daf0aacf1636b1746ee6fbf"} Feb 02 11:26:01 crc kubenswrapper[4845]: I0202 11:26:01.256555 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:26:03 crc kubenswrapper[4845]: I0202 11:26:03.280565 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cjkz" event={"ID":"24449f44-5470-48a2-b428-b5e44c302895","Type":"ContainerStarted","Data":"eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95"} Feb 02 11:26:04 crc kubenswrapper[4845]: I0202 11:26:04.301273 4845 generic.go:334] "Generic (PLEG): container finished" podID="24449f44-5470-48a2-b428-b5e44c302895" containerID="eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95" exitCode=0 Feb 02 11:26:04 crc kubenswrapper[4845]: I0202 11:26:04.301522 4845 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-8cjkz" event={"ID":"24449f44-5470-48a2-b428-b5e44c302895","Type":"ContainerDied","Data":"eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95"} Feb 02 11:26:05 crc kubenswrapper[4845]: I0202 11:26:05.319046 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cjkz" event={"ID":"24449f44-5470-48a2-b428-b5e44c302895","Type":"ContainerStarted","Data":"62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43"} Feb 02 11:26:05 crc kubenswrapper[4845]: I0202 11:26:05.352232 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8cjkz" podStartSLOduration=2.9102035109999997 podStartE2EDuration="6.352204489s" podCreationTimestamp="2026-02-02 11:25:59 +0000 UTC" firstStartedPulling="2026-02-02 11:26:01.256283398 +0000 UTC m=+3242.347684838" lastFinishedPulling="2026-02-02 11:26:04.698284366 +0000 UTC m=+3245.789685816" observedRunningTime="2026-02-02 11:26:05.346268028 +0000 UTC m=+3246.437669488" watchObservedRunningTime="2026-02-02 11:26:05.352204489 +0000 UTC m=+3246.443605939" Feb 02 11:26:10 crc kubenswrapper[4845]: I0202 11:26:10.164199 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:10 crc kubenswrapper[4845]: I0202 11:26:10.164764 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:10 crc kubenswrapper[4845]: I0202 11:26:10.220694 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:10 crc kubenswrapper[4845]: I0202 11:26:10.415487 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:10 crc kubenswrapper[4845]: I0202 
11:26:10.465233 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cjkz"] Feb 02 11:26:12 crc kubenswrapper[4845]: I0202 11:26:12.394369 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8cjkz" podUID="24449f44-5470-48a2-b428-b5e44c302895" containerName="registry-server" containerID="cri-o://62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43" gracePeriod=2 Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.120159 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.284961 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-catalog-content\") pod \"24449f44-5470-48a2-b428-b5e44c302895\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.285337 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvjl\" (UniqueName: \"kubernetes.io/projected/24449f44-5470-48a2-b428-b5e44c302895-kube-api-access-4zvjl\") pod \"24449f44-5470-48a2-b428-b5e44c302895\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.285414 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-utilities\") pod \"24449f44-5470-48a2-b428-b5e44c302895\" (UID: \"24449f44-5470-48a2-b428-b5e44c302895\") " Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.286321 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-utilities" (OuterVolumeSpecName: 
"utilities") pod "24449f44-5470-48a2-b428-b5e44c302895" (UID: "24449f44-5470-48a2-b428-b5e44c302895"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.292569 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24449f44-5470-48a2-b428-b5e44c302895-kube-api-access-4zvjl" (OuterVolumeSpecName: "kube-api-access-4zvjl") pod "24449f44-5470-48a2-b428-b5e44c302895" (UID: "24449f44-5470-48a2-b428-b5e44c302895"). InnerVolumeSpecName "kube-api-access-4zvjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.308611 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24449f44-5470-48a2-b428-b5e44c302895" (UID: "24449f44-5470-48a2-b428-b5e44c302895"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.388677 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.388985 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvjl\" (UniqueName: \"kubernetes.io/projected/24449f44-5470-48a2-b428-b5e44c302895-kube-api-access-4zvjl\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.389101 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24449f44-5470-48a2-b428-b5e44c302895-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.406672 4845 generic.go:334] "Generic (PLEG): container finished" podID="24449f44-5470-48a2-b428-b5e44c302895" containerID="62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43" exitCode=0 Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.406717 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cjkz" event={"ID":"24449f44-5470-48a2-b428-b5e44c302895","Type":"ContainerDied","Data":"62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43"} Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.406744 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8cjkz" event={"ID":"24449f44-5470-48a2-b428-b5e44c302895","Type":"ContainerDied","Data":"59c0560168a919aad63948ffa20e93fc4981745b7daf0aacf1636b1746ee6fbf"} Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.406759 4845 scope.go:117] "RemoveContainer" containerID="62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 
11:26:13.406794 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8cjkz" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.429355 4845 scope.go:117] "RemoveContainer" containerID="eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.444618 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cjkz"] Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.454875 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8cjkz"] Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.469651 4845 scope.go:117] "RemoveContainer" containerID="ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.506204 4845 scope.go:117] "RemoveContainer" containerID="62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43" Feb 02 11:26:13 crc kubenswrapper[4845]: E0202 11:26:13.506624 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43\": container with ID starting with 62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43 not found: ID does not exist" containerID="62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.506661 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43"} err="failed to get container status \"62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43\": rpc error: code = NotFound desc = could not find container \"62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43\": container with ID starting with 
62ebdb0a0cc61aa20bf0bf3d31cb2ecfc40e40dd77158fec5f66249eff6b6d43 not found: ID does not exist" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.506686 4845 scope.go:117] "RemoveContainer" containerID="eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95" Feb 02 11:26:13 crc kubenswrapper[4845]: E0202 11:26:13.506963 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95\": container with ID starting with eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95 not found: ID does not exist" containerID="eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.507043 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95"} err="failed to get container status \"eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95\": rpc error: code = NotFound desc = could not find container \"eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95\": container with ID starting with eb2701e64c29a4841dd02df677cae1738dcaaafb26704dfd153aaf007130ea95 not found: ID does not exist" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.507105 4845 scope.go:117] "RemoveContainer" containerID="ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32" Feb 02 11:26:13 crc kubenswrapper[4845]: E0202 11:26:13.507471 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32\": container with ID starting with ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32 not found: ID does not exist" containerID="ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32" Feb 02 11:26:13 crc 
kubenswrapper[4845]: I0202 11:26:13.507566 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32"} err="failed to get container status \"ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32\": rpc error: code = NotFound desc = could not find container \"ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32\": container with ID starting with ebcbc35da1b150597e117b27e957c561c3b786327ecabc4dfacd2d1ecd84cf32 not found: ID does not exist" Feb 02 11:26:13 crc kubenswrapper[4845]: I0202 11:26:13.727315 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24449f44-5470-48a2-b428-b5e44c302895" path="/var/lib/kubelet/pods/24449f44-5470-48a2-b428-b5e44c302895/volumes" Feb 02 11:27:46 crc kubenswrapper[4845]: I0202 11:27:46.237751 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:27:46 crc kubenswrapper[4845]: I0202 11:27:46.239104 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:28:16 crc kubenswrapper[4845]: I0202 11:28:16.237429 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:28:16 crc kubenswrapper[4845]: I0202 11:28:16.237970 4845 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:28:46 crc kubenswrapper[4845]: I0202 11:28:46.237694 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:28:46 crc kubenswrapper[4845]: I0202 11:28:46.238337 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:28:46 crc kubenswrapper[4845]: I0202 11:28:46.238389 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 11:28:46 crc kubenswrapper[4845]: I0202 11:28:46.239483 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d56234b2e16d7741ff4ae8dc5b5dbf48186b954479748d7114398eb29a827588"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:28:46 crc kubenswrapper[4845]: I0202 11:28:46.239539 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" 
containerName="machine-config-daemon" containerID="cri-o://d56234b2e16d7741ff4ae8dc5b5dbf48186b954479748d7114398eb29a827588" gracePeriod=600 Feb 02 11:28:47 crc kubenswrapper[4845]: I0202 11:28:47.316323 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="d56234b2e16d7741ff4ae8dc5b5dbf48186b954479748d7114398eb29a827588" exitCode=0 Feb 02 11:28:47 crc kubenswrapper[4845]: I0202 11:28:47.316417 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"d56234b2e16d7741ff4ae8dc5b5dbf48186b954479748d7114398eb29a827588"} Feb 02 11:28:47 crc kubenswrapper[4845]: I0202 11:28:47.316921 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"} Feb 02 11:28:47 crc kubenswrapper[4845]: I0202 11:28:47.316951 4845 scope.go:117] "RemoveContainer" containerID="36183e2dfd6aaef1824bcd769ccb254bb6eb7d7237f3d8a43eef04603719266c" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.593831 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z6b55"] Feb 02 11:29:16 crc kubenswrapper[4845]: E0202 11:29:16.594951 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24449f44-5470-48a2-b428-b5e44c302895" containerName="extract-utilities" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.594970 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="24449f44-5470-48a2-b428-b5e44c302895" containerName="extract-utilities" Feb 02 11:29:16 crc kubenswrapper[4845]: E0202 11:29:16.594988 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24449f44-5470-48a2-b428-b5e44c302895" 
containerName="extract-content" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.594995 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="24449f44-5470-48a2-b428-b5e44c302895" containerName="extract-content" Feb 02 11:29:16 crc kubenswrapper[4845]: E0202 11:29:16.595038 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24449f44-5470-48a2-b428-b5e44c302895" containerName="registry-server" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.595046 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="24449f44-5470-48a2-b428-b5e44c302895" containerName="registry-server" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.595309 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="24449f44-5470-48a2-b428-b5e44c302895" containerName="registry-server" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.597497 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.603638 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-catalog-content\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.603764 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-utilities\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.603939 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-svt7g\" (UniqueName: \"kubernetes.io/projected/61834d50-0e84-4db8-9777-640ca6b26d60-kube-api-access-svt7g\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.613758 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6b55"] Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.705980 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-catalog-content\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.706060 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-utilities\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.706165 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svt7g\" (UniqueName: \"kubernetes.io/projected/61834d50-0e84-4db8-9777-640ca6b26d60-kube-api-access-svt7g\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.706491 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-catalog-content\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 
11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.706548 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-utilities\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.727787 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svt7g\" (UniqueName: \"kubernetes.io/projected/61834d50-0e84-4db8-9777-640ca6b26d60-kube-api-access-svt7g\") pod \"redhat-operators-z6b55\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:16 crc kubenswrapper[4845]: I0202 11:29:16.920840 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:17 crc kubenswrapper[4845]: I0202 11:29:17.523198 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z6b55"] Feb 02 11:29:17 crc kubenswrapper[4845]: I0202 11:29:17.647468 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6b55" event={"ID":"61834d50-0e84-4db8-9777-640ca6b26d60","Type":"ContainerStarted","Data":"154df5129783a3ccccf1aeba99b7d58286cbaa89ab345a93ba51357b26f227dd"} Feb 02 11:29:18 crc kubenswrapper[4845]: I0202 11:29:18.658497 4845 generic.go:334] "Generic (PLEG): container finished" podID="61834d50-0e84-4db8-9777-640ca6b26d60" containerID="f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5" exitCode=0 Feb 02 11:29:18 crc kubenswrapper[4845]: I0202 11:29:18.658701 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6b55" 
event={"ID":"61834d50-0e84-4db8-9777-640ca6b26d60","Type":"ContainerDied","Data":"f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5"} Feb 02 11:29:20 crc kubenswrapper[4845]: I0202 11:29:20.679217 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6b55" event={"ID":"61834d50-0e84-4db8-9777-640ca6b26d60","Type":"ContainerStarted","Data":"01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9"} Feb 02 11:29:27 crc kubenswrapper[4845]: I0202 11:29:27.748924 4845 generic.go:334] "Generic (PLEG): container finished" podID="61834d50-0e84-4db8-9777-640ca6b26d60" containerID="01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9" exitCode=0 Feb 02 11:29:27 crc kubenswrapper[4845]: I0202 11:29:27.749027 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6b55" event={"ID":"61834d50-0e84-4db8-9777-640ca6b26d60","Type":"ContainerDied","Data":"01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9"} Feb 02 11:29:28 crc kubenswrapper[4845]: I0202 11:29:28.764721 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6b55" event={"ID":"61834d50-0e84-4db8-9777-640ca6b26d60","Type":"ContainerStarted","Data":"ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50"} Feb 02 11:29:28 crc kubenswrapper[4845]: I0202 11:29:28.796911 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z6b55" podStartSLOduration=3.30269963 podStartE2EDuration="12.796867527s" podCreationTimestamp="2026-02-02 11:29:16 +0000 UTC" firstStartedPulling="2026-02-02 11:29:18.660446574 +0000 UTC m=+3439.751848024" lastFinishedPulling="2026-02-02 11:29:28.154614471 +0000 UTC m=+3449.246015921" observedRunningTime="2026-02-02 11:29:28.786102127 +0000 UTC m=+3449.877503597" watchObservedRunningTime="2026-02-02 11:29:28.796867527 +0000 UTC m=+3449.888268977" 
Feb 02 11:29:36 crc kubenswrapper[4845]: I0202 11:29:36.921422 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:36 crc kubenswrapper[4845]: I0202 11:29:36.921979 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:37 crc kubenswrapper[4845]: I0202 11:29:37.976627 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z6b55" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="registry-server" probeResult="failure" output=< Feb 02 11:29:37 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 11:29:37 crc kubenswrapper[4845]: > Feb 02 11:29:46 crc kubenswrapper[4845]: I0202 11:29:46.982898 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:47 crc kubenswrapper[4845]: I0202 11:29:47.055343 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:47 crc kubenswrapper[4845]: I0202 11:29:47.799784 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6b55"] Feb 02 11:29:48 crc kubenswrapper[4845]: I0202 11:29:48.970075 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z6b55" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="registry-server" containerID="cri-o://ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50" gracePeriod=2 Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.632706 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z6b55" Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.705452 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-utilities\") pod \"61834d50-0e84-4db8-9777-640ca6b26d60\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.705848 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svt7g\" (UniqueName: \"kubernetes.io/projected/61834d50-0e84-4db8-9777-640ca6b26d60-kube-api-access-svt7g\") pod \"61834d50-0e84-4db8-9777-640ca6b26d60\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.706124 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-catalog-content\") pod \"61834d50-0e84-4db8-9777-640ca6b26d60\" (UID: \"61834d50-0e84-4db8-9777-640ca6b26d60\") " Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.707787 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-utilities" (OuterVolumeSpecName: "utilities") pod "61834d50-0e84-4db8-9777-640ca6b26d60" (UID: "61834d50-0e84-4db8-9777-640ca6b26d60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.725173 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61834d50-0e84-4db8-9777-640ca6b26d60-kube-api-access-svt7g" (OuterVolumeSpecName: "kube-api-access-svt7g") pod "61834d50-0e84-4db8-9777-640ca6b26d60" (UID: "61834d50-0e84-4db8-9777-640ca6b26d60"). InnerVolumeSpecName "kube-api-access-svt7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.813788 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.813819 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svt7g\" (UniqueName: \"kubernetes.io/projected/61834d50-0e84-4db8-9777-640ca6b26d60-kube-api-access-svt7g\") on node \"crc\" DevicePath \"\"" Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.952230 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61834d50-0e84-4db8-9777-640ca6b26d60" (UID: "61834d50-0e84-4db8-9777-640ca6b26d60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.984834 4845 generic.go:334] "Generic (PLEG): container finished" podID="61834d50-0e84-4db8-9777-640ca6b26d60" containerID="ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50" exitCode=0 Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.984899 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6b55" event={"ID":"61834d50-0e84-4db8-9777-640ca6b26d60","Type":"ContainerDied","Data":"ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50"} Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.984931 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z6b55" event={"ID":"61834d50-0e84-4db8-9777-640ca6b26d60","Type":"ContainerDied","Data":"154df5129783a3ccccf1aeba99b7d58286cbaa89ab345a93ba51357b26f227dd"} Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.984952 
4845 scope.go:117] "RemoveContainer" containerID="ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50"
Feb 02 11:29:49 crc kubenswrapper[4845]: I0202 11:29:49.985143 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z6b55"
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.019776 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61834d50-0e84-4db8-9777-640ca6b26d60-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.025965 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z6b55"]
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.030139 4845 scope.go:117] "RemoveContainer" containerID="01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9"
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.045487 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z6b55"]
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.081730 4845 scope.go:117] "RemoveContainer" containerID="f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5"
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.118988 4845 scope.go:117] "RemoveContainer" containerID="ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50"
Feb 02 11:29:50 crc kubenswrapper[4845]: E0202 11:29:50.119474 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50\": container with ID starting with ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50 not found: ID does not exist" containerID="ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50"
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.119533 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50"} err="failed to get container status \"ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50\": rpc error: code = NotFound desc = could not find container \"ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50\": container with ID starting with ba6394952690016e2bc561e4ff8101ae0b164aeb25b7f3a80c0d6c9d6b83cd50 not found: ID does not exist"
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.119570 4845 scope.go:117] "RemoveContainer" containerID="01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9"
Feb 02 11:29:50 crc kubenswrapper[4845]: E0202 11:29:50.120292 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9\": container with ID starting with 01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9 not found: ID does not exist" containerID="01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9"
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.121131 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9"} err="failed to get container status \"01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9\": rpc error: code = NotFound desc = could not find container \"01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9\": container with ID starting with 01501528ca60a8de0900c04152d80c68fe32fef3a5fe3bc7c8f3cf138b87eac9 not found: ID does not exist"
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.121177 4845 scope.go:117] "RemoveContainer" containerID="f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5"
Feb 02 11:29:50 crc kubenswrapper[4845]: E0202 11:29:50.121507 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5\": container with ID starting with f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5 not found: ID does not exist" containerID="f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5"
Feb 02 11:29:50 crc kubenswrapper[4845]: I0202 11:29:50.121533 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5"} err="failed to get container status \"f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5\": rpc error: code = NotFound desc = could not find container \"f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5\": container with ID starting with f4a0be7c873dd9dddb0910949bd2bbd98096db4082af9bf6827836b0050900b5 not found: ID does not exist"
Feb 02 11:29:51 crc kubenswrapper[4845]: I0202 11:29:51.728101 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" path="/var/lib/kubelet/pods/61834d50-0e84-4db8-9777-640ca6b26d60/volumes"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.205786 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"]
Feb 02 11:30:00 crc kubenswrapper[4845]: E0202 11:30:00.209708 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="registry-server"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.209833 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="registry-server"
Feb 02 11:30:00 crc kubenswrapper[4845]: E0202 11:30:00.209946 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="extract-utilities"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.210028 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="extract-utilities"
Feb 02 11:30:00 crc kubenswrapper[4845]: E0202 11:30:00.210119 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="extract-content"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.210207 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="extract-content"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.210602 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="61834d50-0e84-4db8-9777-640ca6b26d60" containerName="registry-server"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.212533 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.222936 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.223356 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.227160 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"]
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.358410 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a82306d-7b9e-4626-b188-0f5949bb75d5-secret-volume\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.358926 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx642\" (UniqueName: \"kubernetes.io/projected/3a82306d-7b9e-4626-b188-0f5949bb75d5-kube-api-access-xx642\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.359218 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a82306d-7b9e-4626-b188-0f5949bb75d5-config-volume\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.462502 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a82306d-7b9e-4626-b188-0f5949bb75d5-secret-volume\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.462602 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx642\" (UniqueName: \"kubernetes.io/projected/3a82306d-7b9e-4626-b188-0f5949bb75d5-kube-api-access-xx642\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.462671 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a82306d-7b9e-4626-b188-0f5949bb75d5-config-volume\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.464006 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a82306d-7b9e-4626-b188-0f5949bb75d5-config-volume\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.472132 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a82306d-7b9e-4626-b188-0f5949bb75d5-secret-volume\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.488111 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx642\" (UniqueName: \"kubernetes.io/projected/3a82306d-7b9e-4626-b188-0f5949bb75d5-kube-api-access-xx642\") pod \"collect-profiles-29500530-j6pnv\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:00 crc kubenswrapper[4845]: I0202 11:30:00.552867 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:01 crc kubenswrapper[4845]: I0202 11:30:01.150001 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"]
Feb 02 11:30:02 crc kubenswrapper[4845]: I0202 11:30:02.143511 4845 generic.go:334] "Generic (PLEG): container finished" podID="3a82306d-7b9e-4626-b188-0f5949bb75d5" containerID="edb6987761faadf9de2625a5fc2838557508106560e93a9005dcd6dd2f5f7b4f" exitCode=0
Feb 02 11:30:02 crc kubenswrapper[4845]: I0202 11:30:02.143641 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv" event={"ID":"3a82306d-7b9e-4626-b188-0f5949bb75d5","Type":"ContainerDied","Data":"edb6987761faadf9de2625a5fc2838557508106560e93a9005dcd6dd2f5f7b4f"}
Feb 02 11:30:02 crc kubenswrapper[4845]: I0202 11:30:02.144173 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv" event={"ID":"3a82306d-7b9e-4626-b188-0f5949bb75d5","Type":"ContainerStarted","Data":"998b19e9c77f4187ea614be7e1eaa6c1e2f0406f6cc7acd15c0aa5604ea7c105"}
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.716675 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.790781 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a82306d-7b9e-4626-b188-0f5949bb75d5-secret-volume\") pod \"3a82306d-7b9e-4626-b188-0f5949bb75d5\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") "
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.791362 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a82306d-7b9e-4626-b188-0f5949bb75d5-config-volume\") pod \"3a82306d-7b9e-4626-b188-0f5949bb75d5\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") "
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.791467 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx642\" (UniqueName: \"kubernetes.io/projected/3a82306d-7b9e-4626-b188-0f5949bb75d5-kube-api-access-xx642\") pod \"3a82306d-7b9e-4626-b188-0f5949bb75d5\" (UID: \"3a82306d-7b9e-4626-b188-0f5949bb75d5\") "
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.792633 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a82306d-7b9e-4626-b188-0f5949bb75d5-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a82306d-7b9e-4626-b188-0f5949bb75d5" (UID: "3a82306d-7b9e-4626-b188-0f5949bb75d5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.793629 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a82306d-7b9e-4626-b188-0f5949bb75d5-config-volume\") on node \"crc\" DevicePath \"\""
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.799781 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a82306d-7b9e-4626-b188-0f5949bb75d5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a82306d-7b9e-4626-b188-0f5949bb75d5" (UID: "3a82306d-7b9e-4626-b188-0f5949bb75d5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.804758 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a82306d-7b9e-4626-b188-0f5949bb75d5-kube-api-access-xx642" (OuterVolumeSpecName: "kube-api-access-xx642") pod "3a82306d-7b9e-4626-b188-0f5949bb75d5" (UID: "3a82306d-7b9e-4626-b188-0f5949bb75d5"). InnerVolumeSpecName "kube-api-access-xx642". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.897124 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a82306d-7b9e-4626-b188-0f5949bb75d5-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 02 11:30:03 crc kubenswrapper[4845]: I0202 11:30:03.897650 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx642\" (UniqueName: \"kubernetes.io/projected/3a82306d-7b9e-4626-b188-0f5949bb75d5-kube-api-access-xx642\") on node \"crc\" DevicePath \"\""
Feb 02 11:30:04 crc kubenswrapper[4845]: I0202 11:30:04.179933 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv" event={"ID":"3a82306d-7b9e-4626-b188-0f5949bb75d5","Type":"ContainerDied","Data":"998b19e9c77f4187ea614be7e1eaa6c1e2f0406f6cc7acd15c0aa5604ea7c105"}
Feb 02 11:30:04 crc kubenswrapper[4845]: I0202 11:30:04.180004 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-j6pnv"
Feb 02 11:30:04 crc kubenswrapper[4845]: I0202 11:30:04.180019 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="998b19e9c77f4187ea614be7e1eaa6c1e2f0406f6cc7acd15c0aa5604ea7c105"
Feb 02 11:30:04 crc kubenswrapper[4845]: I0202 11:30:04.820421 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq"]
Feb 02 11:30:04 crc kubenswrapper[4845]: I0202 11:30:04.833114 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-wl2wq"]
Feb 02 11:30:05 crc kubenswrapper[4845]: I0202 11:30:05.732794 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af5c06e-cf07-4f85-97e9-6b93ec03281c" path="/var/lib/kubelet/pods/6af5c06e-cf07-4f85-97e9-6b93ec03281c/volumes"
Feb 02 11:30:11 crc kubenswrapper[4845]: I0202 11:30:11.261849 4845 scope.go:117] "RemoveContainer" containerID="a18e0c7c2dae09d3f5627d9a9acf7414e19ce7a97f56c94017a2bc4812f89130"
Feb 02 11:30:46 crc kubenswrapper[4845]: I0202 11:30:46.237903 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:30:46 crc kubenswrapper[4845]: I0202 11:30:46.238495 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.212390 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tz54l"]
Feb 02 11:31:07 crc kubenswrapper[4845]: E0202 11:31:07.215625 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a82306d-7b9e-4626-b188-0f5949bb75d5" containerName="collect-profiles"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.215648 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a82306d-7b9e-4626-b188-0f5949bb75d5" containerName="collect-profiles"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.215989 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a82306d-7b9e-4626-b188-0f5949bb75d5" containerName="collect-profiles"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.219223 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.238025 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tz54l"]
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.330342 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-utilities\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.330437 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn65q\" (UniqueName: \"kubernetes.io/projected/ec8ab263-5831-40ac-aa87-f187c2b59314-kube-api-access-fn65q\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.331665 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-catalog-content\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.413502 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bw7j2"]
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.416232 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.430336 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bw7j2"]
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.435736 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-catalog-content\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.435788 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-catalog-content\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.436407 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-catalog-content\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.436418 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-utilities\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.436454 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-utilities\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.436585 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn65q\" (UniqueName: \"kubernetes.io/projected/ec8ab263-5831-40ac-aa87-f187c2b59314-kube-api-access-fn65q\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.436710 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz2rw\" (UniqueName: \"kubernetes.io/projected/84c47002-db31-44bf-8691-c1a21ba5a78e-kube-api-access-hz2rw\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.437294 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-utilities\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.467489 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn65q\" (UniqueName: \"kubernetes.io/projected/ec8ab263-5831-40ac-aa87-f187c2b59314-kube-api-access-fn65q\") pod \"certified-operators-tz54l\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.541119 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-catalog-content\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.541413 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-utilities\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.541487 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz2rw\" (UniqueName: \"kubernetes.io/projected/84c47002-db31-44bf-8691-c1a21ba5a78e-kube-api-access-hz2rw\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.541695 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-catalog-content\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.541979 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-utilities\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.547668 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.577813 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz2rw\" (UniqueName: \"kubernetes.io/projected/84c47002-db31-44bf-8691-c1a21ba5a78e-kube-api-access-hz2rw\") pod \"community-operators-bw7j2\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:07 crc kubenswrapper[4845]: I0202 11:31:07.749596 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.223990 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tz54l"]
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.598743 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bw7j2"]
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.989624 4845 generic.go:334] "Generic (PLEG): container finished" podID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerID="acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba" exitCode=0
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.989681 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tz54l" event={"ID":"ec8ab263-5831-40ac-aa87-f187c2b59314","Type":"ContainerDied","Data":"acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba"}
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.989958 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tz54l" event={"ID":"ec8ab263-5831-40ac-aa87-f187c2b59314","Type":"ContainerStarted","Data":"cd11d8ef1250d6350601d85867207b1fddc0dbb1f1a948ede5f7446e720e4011"}
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.991745 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.993095 4845 generic.go:334] "Generic (PLEG): container finished" podID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerID="35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436" exitCode=0
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.993132 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw7j2" event={"ID":"84c47002-db31-44bf-8691-c1a21ba5a78e","Type":"ContainerDied","Data":"35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436"}
Feb 02 11:31:08 crc kubenswrapper[4845]: I0202 11:31:08.993154 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw7j2" event={"ID":"84c47002-db31-44bf-8691-c1a21ba5a78e","Type":"ContainerStarted","Data":"1ac47ff9ae71bd0e30bcc2ba96930aff3367febb4799071e018c1f23e5d1db73"}
Feb 02 11:31:12 crc kubenswrapper[4845]: I0202 11:31:12.036647 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tz54l" event={"ID":"ec8ab263-5831-40ac-aa87-f187c2b59314","Type":"ContainerStarted","Data":"0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08"}
Feb 02 11:31:12 crc kubenswrapper[4845]: I0202 11:31:12.045655 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw7j2" event={"ID":"84c47002-db31-44bf-8691-c1a21ba5a78e","Type":"ContainerStarted","Data":"cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565"}
Feb 02 11:31:13 crc kubenswrapper[4845]: I0202 11:31:13.062427 4845 generic.go:334] "Generic (PLEG): container finished" podID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerID="cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565" exitCode=0
Feb 02 11:31:13 crc kubenswrapper[4845]: I0202 11:31:13.062538 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw7j2" event={"ID":"84c47002-db31-44bf-8691-c1a21ba5a78e","Type":"ContainerDied","Data":"cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565"}
Feb 02 11:31:14 crc kubenswrapper[4845]: I0202 11:31:14.077049 4845 generic.go:334] "Generic (PLEG): container finished" podID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerID="0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08" exitCode=0
Feb 02 11:31:14 crc kubenswrapper[4845]: I0202 11:31:14.077114 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tz54l" event={"ID":"ec8ab263-5831-40ac-aa87-f187c2b59314","Type":"ContainerDied","Data":"0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08"}
Feb 02 11:31:15 crc kubenswrapper[4845]: I0202 11:31:15.094165 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tz54l" event={"ID":"ec8ab263-5831-40ac-aa87-f187c2b59314","Type":"ContainerStarted","Data":"8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d"}
Feb 02 11:31:15 crc kubenswrapper[4845]: I0202 11:31:15.097914 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw7j2" event={"ID":"84c47002-db31-44bf-8691-c1a21ba5a78e","Type":"ContainerStarted","Data":"5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022"}
Feb 02 11:31:15 crc kubenswrapper[4845]: I0202 11:31:15.115650 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tz54l" podStartSLOduration=2.62289833 podStartE2EDuration="8.11563035s" podCreationTimestamp="2026-02-02 11:31:07 +0000 UTC" firstStartedPulling="2026-02-02 11:31:08.991491825 +0000 UTC m=+3550.082893275" lastFinishedPulling="2026-02-02 11:31:14.484223835 +0000 UTC m=+3555.575625295" observedRunningTime="2026-02-02 11:31:15.11459131 +0000 UTC m=+3556.205992770" watchObservedRunningTime="2026-02-02 11:31:15.11563035 +0000 UTC m=+3556.207031790"
Feb 02 11:31:15 crc kubenswrapper[4845]: I0202 11:31:15.139371 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bw7j2" podStartSLOduration=2.991077367 podStartE2EDuration="8.13934796s" podCreationTimestamp="2026-02-02 11:31:07 +0000 UTC" firstStartedPulling="2026-02-02 11:31:08.995003125 +0000 UTC m=+3550.086404575" lastFinishedPulling="2026-02-02 11:31:14.143273718 +0000 UTC m=+3555.234675168" observedRunningTime="2026-02-02 11:31:15.134075499 +0000 UTC m=+3556.225476959" watchObservedRunningTime="2026-02-02 11:31:15.13934796 +0000 UTC m=+3556.230749420"
Feb 02 11:31:16 crc kubenswrapper[4845]: I0202 11:31:16.238429 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:31:16 crc kubenswrapper[4845]: I0202 11:31:16.239365 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:31:17 crc kubenswrapper[4845]: I0202 11:31:17.548150 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:17 crc kubenswrapper[4845]: I0202 11:31:17.548433 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:17 crc kubenswrapper[4845]: I0202 11:31:17.609938 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:17 crc kubenswrapper[4845]: I0202 11:31:17.750339 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:17 crc kubenswrapper[4845]: I0202 11:31:17.750445 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:17 crc kubenswrapper[4845]: I0202 11:31:17.806994 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:27 crc kubenswrapper[4845]: I0202 11:31:27.610197 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:27 crc kubenswrapper[4845]: I0202 11:31:27.813537 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:32 crc kubenswrapper[4845]: I0202 11:31:32.197159 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tz54l"]
Feb 02 11:31:32 crc kubenswrapper[4845]: I0202 11:31:32.198084 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tz54l" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerName="registry-server" containerID="cri-o://8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d" gracePeriod=2
Feb 02 11:31:32 crc kubenswrapper[4845]: I0202 11:31:32.594287 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bw7j2"]
Feb 02 11:31:32 crc kubenswrapper[4845]: I0202 11:31:32.594798 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bw7j2" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerName="registry-server" containerID="cri-o://5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022" gracePeriod=2
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.477461 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.484803 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bw7j2"
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.555872 4845 generic.go:334] "Generic (PLEG): container finished" podID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerID="8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d" exitCode=0
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.555961 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tz54l" event={"ID":"ec8ab263-5831-40ac-aa87-f187c2b59314","Type":"ContainerDied","Data":"8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d"}
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.556027 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tz54l" event={"ID":"ec8ab263-5831-40ac-aa87-f187c2b59314","Type":"ContainerDied","Data":"cd11d8ef1250d6350601d85867207b1fddc0dbb1f1a948ede5f7446e720e4011"}
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.556050 4845 scope.go:117] "RemoveContainer" containerID="8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d"
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.556281 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tz54l"
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.560169 4845 generic.go:334] "Generic (PLEG): container finished" podID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerID="5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022" exitCode=0
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.560247 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw7j2" event={"ID":"84c47002-db31-44bf-8691-c1a21ba5a78e","Type":"ContainerDied","Data":"5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022"}
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.560282 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw7j2" event={"ID":"84c47002-db31-44bf-8691-c1a21ba5a78e","Type":"ContainerDied","Data":"1ac47ff9ae71bd0e30bcc2ba96930aff3367febb4799071e018c1f23e5d1db73"}
Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.560304 4845 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-bw7j2" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.580963 4845 scope.go:117] "RemoveContainer" containerID="0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.610677 4845 scope.go:117] "RemoveContainer" containerID="acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.624650 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz2rw\" (UniqueName: \"kubernetes.io/projected/84c47002-db31-44bf-8691-c1a21ba5a78e-kube-api-access-hz2rw\") pod \"84c47002-db31-44bf-8691-c1a21ba5a78e\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.624781 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-utilities\") pod \"ec8ab263-5831-40ac-aa87-f187c2b59314\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.624825 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-catalog-content\") pod \"ec8ab263-5831-40ac-aa87-f187c2b59314\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.624993 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-utilities\") pod \"84c47002-db31-44bf-8691-c1a21ba5a78e\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.625227 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-catalog-content\") pod \"84c47002-db31-44bf-8691-c1a21ba5a78e\" (UID: \"84c47002-db31-44bf-8691-c1a21ba5a78e\") " Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.625289 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn65q\" (UniqueName: \"kubernetes.io/projected/ec8ab263-5831-40ac-aa87-f187c2b59314-kube-api-access-fn65q\") pod \"ec8ab263-5831-40ac-aa87-f187c2b59314\" (UID: \"ec8ab263-5831-40ac-aa87-f187c2b59314\") " Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.626028 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-utilities" (OuterVolumeSpecName: "utilities") pod "ec8ab263-5831-40ac-aa87-f187c2b59314" (UID: "ec8ab263-5831-40ac-aa87-f187c2b59314"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.626142 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-utilities" (OuterVolumeSpecName: "utilities") pod "84c47002-db31-44bf-8691-c1a21ba5a78e" (UID: "84c47002-db31-44bf-8691-c1a21ba5a78e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.627227 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.627258 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.632061 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8ab263-5831-40ac-aa87-f187c2b59314-kube-api-access-fn65q" (OuterVolumeSpecName: "kube-api-access-fn65q") pod "ec8ab263-5831-40ac-aa87-f187c2b59314" (UID: "ec8ab263-5831-40ac-aa87-f187c2b59314"). InnerVolumeSpecName "kube-api-access-fn65q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.632446 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84c47002-db31-44bf-8691-c1a21ba5a78e-kube-api-access-hz2rw" (OuterVolumeSpecName: "kube-api-access-hz2rw") pod "84c47002-db31-44bf-8691-c1a21ba5a78e" (UID: "84c47002-db31-44bf-8691-c1a21ba5a78e"). InnerVolumeSpecName "kube-api-access-hz2rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.687428 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec8ab263-5831-40ac-aa87-f187c2b59314" (UID: "ec8ab263-5831-40ac-aa87-f187c2b59314"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.690243 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84c47002-db31-44bf-8691-c1a21ba5a78e" (UID: "84c47002-db31-44bf-8691-c1a21ba5a78e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.727066 4845 scope.go:117] "RemoveContainer" containerID="8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d" Feb 02 11:31:34 crc kubenswrapper[4845]: E0202 11:31:34.727762 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d\": container with ID starting with 8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d not found: ID does not exist" containerID="8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.727869 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d"} err="failed to get container status \"8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d\": rpc error: code = NotFound desc = could not find container \"8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d\": container with ID starting with 8669c08da5bdec9a653b0545fb0c894b2b341be63906f2d7c4a708a4958d0c7d not found: ID does not exist" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.728017 4845 scope.go:117] "RemoveContainer" containerID="0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08" Feb 02 11:31:34 crc kubenswrapper[4845]: E0202 11:31:34.728524 4845 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08\": container with ID starting with 0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08 not found: ID does not exist" containerID="0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.728672 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08"} err="failed to get container status \"0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08\": rpc error: code = NotFound desc = could not find container \"0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08\": container with ID starting with 0742c7daa74690d966519991d686ec68220a1124cd496cb9a89fa968d5341f08 not found: ID does not exist" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.728742 4845 scope.go:117] "RemoveContainer" containerID="acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba" Feb 02 11:31:34 crc kubenswrapper[4845]: E0202 11:31:34.729208 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba\": container with ID starting with acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba not found: ID does not exist" containerID="acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.729325 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba"} err="failed to get container status \"acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba\": rpc error: code = NotFound desc = could 
not find container \"acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba\": container with ID starting with acecf1d7f72e342f32e9f7baa1fdfed8efa08f9a624e182d96067aaee43590ba not found: ID does not exist" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.729436 4845 scope.go:117] "RemoveContainer" containerID="5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.729551 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz2rw\" (UniqueName: \"kubernetes.io/projected/84c47002-db31-44bf-8691-c1a21ba5a78e-kube-api-access-hz2rw\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.729630 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec8ab263-5831-40ac-aa87-f187c2b59314-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.729645 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84c47002-db31-44bf-8691-c1a21ba5a78e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.729658 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn65q\" (UniqueName: \"kubernetes.io/projected/ec8ab263-5831-40ac-aa87-f187c2b59314-kube-api-access-fn65q\") on node \"crc\" DevicePath \"\"" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.800222 4845 scope.go:117] "RemoveContainer" containerID="cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.829942 4845 scope.go:117] "RemoveContainer" containerID="35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.915030 4845 scope.go:117] "RemoveContainer" 
containerID="5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022" Feb 02 11:31:34 crc kubenswrapper[4845]: E0202 11:31:34.916780 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022\": container with ID starting with 5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022 not found: ID does not exist" containerID="5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.916834 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022"} err="failed to get container status \"5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022\": rpc error: code = NotFound desc = could not find container \"5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022\": container with ID starting with 5e76543513530e1ce52e8f23a6b0e52de4aa0c7016831a0149aecf1b2f9d7022 not found: ID does not exist" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.916872 4845 scope.go:117] "RemoveContainer" containerID="cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565" Feb 02 11:31:34 crc kubenswrapper[4845]: E0202 11:31:34.917383 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565\": container with ID starting with cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565 not found: ID does not exist" containerID="cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.917471 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565"} err="failed to get container status \"cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565\": rpc error: code = NotFound desc = could not find container \"cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565\": container with ID starting with cf97ada089c899c976b0f969e8b4a9c10c10d10f5c77c914b24af209ddf1a565 not found: ID does not exist" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.917544 4845 scope.go:117] "RemoveContainer" containerID="35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436" Feb 02 11:31:34 crc kubenswrapper[4845]: E0202 11:31:34.917971 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436\": container with ID starting with 35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436 not found: ID does not exist" containerID="35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.918011 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436"} err="failed to get container status \"35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436\": rpc error: code = NotFound desc = could not find container \"35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436\": container with ID starting with 35fc1714ec66a391ab718f3d9c895515ba095df3d443d24b7256dfb071c8d436 not found: ID does not exist" Feb 02 11:31:34 crc kubenswrapper[4845]: I0202 11:31:34.996168 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tz54l"] Feb 02 11:31:35 crc kubenswrapper[4845]: I0202 11:31:35.009331 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-tz54l"] Feb 02 11:31:35 crc kubenswrapper[4845]: I0202 11:31:35.020120 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bw7j2"] Feb 02 11:31:35 crc kubenswrapper[4845]: I0202 11:31:35.033707 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bw7j2"] Feb 02 11:31:35 crc kubenswrapper[4845]: I0202 11:31:35.727840 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" path="/var/lib/kubelet/pods/84c47002-db31-44bf-8691-c1a21ba5a78e/volumes" Feb 02 11:31:35 crc kubenswrapper[4845]: I0202 11:31:35.729242 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" path="/var/lib/kubelet/pods/ec8ab263-5831-40ac-aa87-f187c2b59314/volumes" Feb 02 11:31:46 crc kubenswrapper[4845]: I0202 11:31:46.237475 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:31:46 crc kubenswrapper[4845]: I0202 11:31:46.238154 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:31:46 crc kubenswrapper[4845]: I0202 11:31:46.238209 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 11:31:46 crc kubenswrapper[4845]: I0202 11:31:46.239107 4845 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:31:46 crc kubenswrapper[4845]: I0202 11:31:46.239179 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" gracePeriod=600 Feb 02 11:31:46 crc kubenswrapper[4845]: E0202 11:31:46.361270 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:31:46 crc kubenswrapper[4845]: I0202 11:31:46.681212 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" exitCode=0 Feb 02 11:31:46 crc kubenswrapper[4845]: I0202 11:31:46.681250 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"} Feb 02 11:31:46 crc kubenswrapper[4845]: I0202 11:31:46.681289 4845 scope.go:117] "RemoveContainer" containerID="d56234b2e16d7741ff4ae8dc5b5dbf48186b954479748d7114398eb29a827588" Feb 02 11:31:46 crc 
kubenswrapper[4845]: I0202 11:31:46.683701 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:31:46 crc kubenswrapper[4845]: E0202 11:31:46.684517 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:32:01 crc kubenswrapper[4845]: I0202 11:32:01.712956 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:32:01 crc kubenswrapper[4845]: E0202 11:32:01.713712 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:32:16 crc kubenswrapper[4845]: I0202 11:32:16.712930 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:32:16 crc kubenswrapper[4845]: E0202 11:32:16.713696 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 
02 11:32:29 crc kubenswrapper[4845]: I0202 11:32:29.723821 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:32:29 crc kubenswrapper[4845]: E0202 11:32:29.724654 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:32:40 crc kubenswrapper[4845]: I0202 11:32:40.713031 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:32:40 crc kubenswrapper[4845]: E0202 11:32:40.715119 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:32:55 crc kubenswrapper[4845]: I0202 11:32:55.713548 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:32:55 crc kubenswrapper[4845]: E0202 11:32:55.714406 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:33:09 crc kubenswrapper[4845]: I0202 11:33:09.713724 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:33:09 crc kubenswrapper[4845]: E0202 11:33:09.714587 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:33:24 crc kubenswrapper[4845]: I0202 11:33:24.713779 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:33:24 crc kubenswrapper[4845]: E0202 11:33:24.715072 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:33:38 crc kubenswrapper[4845]: I0202 11:33:38.713191 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:33:38 crc kubenswrapper[4845]: E0202 11:33:38.714076 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:33:50 crc kubenswrapper[4845]: I0202 11:33:50.714425 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:33:50 crc kubenswrapper[4845]: E0202 11:33:50.715225 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:34:04 crc kubenswrapper[4845]: I0202 11:34:04.713415 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:34:04 crc kubenswrapper[4845]: E0202 11:34:04.715762 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:34:17 crc kubenswrapper[4845]: I0202 11:34:17.714071 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:34:17 crc kubenswrapper[4845]: E0202 11:34:17.715611 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:34:31 crc kubenswrapper[4845]: I0202 11:34:31.713476 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:34:31 crc kubenswrapper[4845]: E0202 11:34:31.714553 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:34:43 crc kubenswrapper[4845]: I0202 11:34:43.713805 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:34:43 crc kubenswrapper[4845]: E0202 11:34:43.714503 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:34:56 crc kubenswrapper[4845]: I0202 11:34:56.713548 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:34:56 crc kubenswrapper[4845]: E0202 11:34:56.714776 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:35:07 crc kubenswrapper[4845]: I0202 11:35:07.714011 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:35:07 crc kubenswrapper[4845]: E0202 11:35:07.714758 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:35:21 crc kubenswrapper[4845]: I0202 11:35:21.713965 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:35:21 crc kubenswrapper[4845]: E0202 11:35:21.715820 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:35:32 crc kubenswrapper[4845]: I0202 11:35:32.713111 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:35:32 crc kubenswrapper[4845]: E0202 11:35:32.714156 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:35:45 crc kubenswrapper[4845]: I0202 11:35:45.714450 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:35:45 crc kubenswrapper[4845]: E0202 11:35:45.715409 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:35:57 crc kubenswrapper[4845]: I0202 11:35:57.713872 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:35:57 crc kubenswrapper[4845]: E0202 11:35:57.714837 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:36:08 crc kubenswrapper[4845]: I0202 11:36:08.713374 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:36:08 crc kubenswrapper[4845]: E0202 11:36:08.714419 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:36:23 crc kubenswrapper[4845]: I0202 11:36:23.714178 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:36:23 crc kubenswrapper[4845]: E0202 11:36:23.715124 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:36:36 crc kubenswrapper[4845]: I0202 11:36:36.713631 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:36:36 crc kubenswrapper[4845]: E0202 11:36:36.714656 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:36:50 crc kubenswrapper[4845]: I0202 11:36:50.712678 4845 scope.go:117] "RemoveContainer" containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b"
Feb 02 11:36:52 crc kubenswrapper[4845]: I0202 11:36:52.411936 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"800a02972d6ec813ad234012277534be95b3106d04c712ded96644b0403433e0"}
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.358585 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sh64l"]
Feb 02 11:38:58 crc kubenswrapper[4845]: E0202 11:38:58.360151 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerName="extract-utilities"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.360172 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerName="extract-utilities"
Feb 02 11:38:58 crc kubenswrapper[4845]: E0202 11:38:58.360190 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerName="extract-utilities"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.360197 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerName="extract-utilities"
Feb 02 11:38:58 crc kubenswrapper[4845]: E0202 11:38:58.360219 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerName="registry-server"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.360229 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerName="registry-server"
Feb 02 11:38:58 crc kubenswrapper[4845]: E0202 11:38:58.360247 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerName="registry-server"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.360254 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerName="registry-server"
Feb 02 11:38:58 crc kubenswrapper[4845]: E0202 11:38:58.360281 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerName="extract-content"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.360288 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerName="extract-content"
Feb 02 11:38:58 crc kubenswrapper[4845]: E0202 11:38:58.360317 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerName="extract-content"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.360324 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerName="extract-content"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.360568 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8ab263-5831-40ac-aa87-f187c2b59314" containerName="registry-server"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.360586 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="84c47002-db31-44bf-8691-c1a21ba5a78e" containerName="registry-server"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.363120 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.379944 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh64l"]
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.489363 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-utilities\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.489954 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-catalog-content\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.490099 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc8fm\" (UniqueName: \"kubernetes.io/projected/67feb8df-07b7-4752-953d-fa9c66d6504f-kube-api-access-cc8fm\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.592040 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-catalog-content\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.592179 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc8fm\" (UniqueName: \"kubernetes.io/projected/67feb8df-07b7-4752-953d-fa9c66d6504f-kube-api-access-cc8fm\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.592209 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-utilities\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.592631 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-utilities\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.592630 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-catalog-content\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.628024 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc8fm\" (UniqueName: \"kubernetes.io/projected/67feb8df-07b7-4752-953d-fa9c66d6504f-kube-api-access-cc8fm\") pod \"redhat-marketplace-sh64l\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") " pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:58 crc kubenswrapper[4845]: I0202 11:38:58.686455 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:38:59 crc kubenswrapper[4845]: I0202 11:38:59.236472 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh64l"]
Feb 02 11:38:59 crc kubenswrapper[4845]: W0202 11:38:59.238503 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67feb8df_07b7_4752_953d_fa9c66d6504f.slice/crio-b51a14550ffae387b6e6b865beb3bd9e9d799c3dccf7ac00df3b329b73fe8a28 WatchSource:0}: Error finding container b51a14550ffae387b6e6b865beb3bd9e9d799c3dccf7ac00df3b329b73fe8a28: Status 404 returned error can't find the container with id b51a14550ffae387b6e6b865beb3bd9e9d799c3dccf7ac00df3b329b73fe8a28
Feb 02 11:38:59 crc kubenswrapper[4845]: I0202 11:38:59.801614 4845 generic.go:334] "Generic (PLEG): container finished" podID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerID="75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b" exitCode=0
Feb 02 11:38:59 crc kubenswrapper[4845]: I0202 11:38:59.801786 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh64l" event={"ID":"67feb8df-07b7-4752-953d-fa9c66d6504f","Type":"ContainerDied","Data":"75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b"}
Feb 02 11:38:59 crc kubenswrapper[4845]: I0202 11:38:59.802218 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh64l" event={"ID":"67feb8df-07b7-4752-953d-fa9c66d6504f","Type":"ContainerStarted","Data":"b51a14550ffae387b6e6b865beb3bd9e9d799c3dccf7ac00df3b329b73fe8a28"}
Feb 02 11:38:59 crc kubenswrapper[4845]: I0202 11:38:59.807282 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 11:39:00 crc kubenswrapper[4845]: I0202 11:39:00.817774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh64l" event={"ID":"67feb8df-07b7-4752-953d-fa9c66d6504f","Type":"ContainerStarted","Data":"cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f"}
Feb 02 11:39:02 crc kubenswrapper[4845]: I0202 11:39:02.845485 4845 generic.go:334] "Generic (PLEG): container finished" podID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerID="cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f" exitCode=0
Feb 02 11:39:02 crc kubenswrapper[4845]: I0202 11:39:02.845573 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh64l" event={"ID":"67feb8df-07b7-4752-953d-fa9c66d6504f","Type":"ContainerDied","Data":"cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f"}
Feb 02 11:39:03 crc kubenswrapper[4845]: I0202 11:39:03.863566 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh64l" event={"ID":"67feb8df-07b7-4752-953d-fa9c66d6504f","Type":"ContainerStarted","Data":"073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5"}
Feb 02 11:39:03 crc kubenswrapper[4845]: I0202 11:39:03.889055 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sh64l" podStartSLOduration=2.403569813 podStartE2EDuration="5.889011791s" podCreationTimestamp="2026-02-02 11:38:58 +0000 UTC" firstStartedPulling="2026-02-02 11:38:59.806937113 +0000 UTC m=+4020.898338563" lastFinishedPulling="2026-02-02 11:39:03.292379091 +0000 UTC m=+4024.383780541" observedRunningTime="2026-02-02 11:39:03.888049813 +0000 UTC m=+4024.979451283" watchObservedRunningTime="2026-02-02 11:39:03.889011791 +0000 UTC m=+4024.980413241"
Feb 02 11:39:08 crc kubenswrapper[4845]: I0202 11:39:08.686820 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:39:08 crc kubenswrapper[4845]: I0202 11:39:08.687409 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:39:09 crc kubenswrapper[4845]: I0202 11:39:09.740484 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-sh64l" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="registry-server" probeResult="failure" output=<
Feb 02 11:39:09 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s
Feb 02 11:39:09 crc kubenswrapper[4845]: >
Feb 02 11:39:16 crc kubenswrapper[4845]: I0202 11:39:16.237845 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:39:16 crc kubenswrapper[4845]: I0202 11:39:16.238863 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:39:18 crc kubenswrapper[4845]: I0202 11:39:18.734200 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:39:18 crc kubenswrapper[4845]: I0202 11:39:18.795372 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:39:18 crc kubenswrapper[4845]: I0202 11:39:18.971971 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh64l"]
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.067086 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sh64l" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="registry-server" containerID="cri-o://073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5" gracePeriod=2
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.749841 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.878678 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc8fm\" (UniqueName: \"kubernetes.io/projected/67feb8df-07b7-4752-953d-fa9c66d6504f-kube-api-access-cc8fm\") pod \"67feb8df-07b7-4752-953d-fa9c66d6504f\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") "
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.878995 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-catalog-content\") pod \"67feb8df-07b7-4752-953d-fa9c66d6504f\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") "
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.879130 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-utilities\") pod \"67feb8df-07b7-4752-953d-fa9c66d6504f\" (UID: \"67feb8df-07b7-4752-953d-fa9c66d6504f\") "
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.880355 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-utilities" (OuterVolumeSpecName: "utilities") pod "67feb8df-07b7-4752-953d-fa9c66d6504f" (UID: "67feb8df-07b7-4752-953d-fa9c66d6504f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.882470 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.889553 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67feb8df-07b7-4752-953d-fa9c66d6504f-kube-api-access-cc8fm" (OuterVolumeSpecName: "kube-api-access-cc8fm") pod "67feb8df-07b7-4752-953d-fa9c66d6504f" (UID: "67feb8df-07b7-4752-953d-fa9c66d6504f"). InnerVolumeSpecName "kube-api-access-cc8fm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.910829 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67feb8df-07b7-4752-953d-fa9c66d6504f" (UID: "67feb8df-07b7-4752-953d-fa9c66d6504f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.984502 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc8fm\" (UniqueName: \"kubernetes.io/projected/67feb8df-07b7-4752-953d-fa9c66d6504f-kube-api-access-cc8fm\") on node \"crc\" DevicePath \"\""
Feb 02 11:39:20 crc kubenswrapper[4845]: I0202 11:39:20.984581 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67feb8df-07b7-4752-953d-fa9c66d6504f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.083963 4845 generic.go:334] "Generic (PLEG): container finished" podID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerID="073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5" exitCode=0
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.083995 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh64l"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.084044 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh64l" event={"ID":"67feb8df-07b7-4752-953d-fa9c66d6504f","Type":"ContainerDied","Data":"073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5"}
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.085107 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh64l" event={"ID":"67feb8df-07b7-4752-953d-fa9c66d6504f","Type":"ContainerDied","Data":"b51a14550ffae387b6e6b865beb3bd9e9d799c3dccf7ac00df3b329b73fe8a28"}
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.085157 4845 scope.go:117] "RemoveContainer" containerID="073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.109847 4845 scope.go:117] "RemoveContainer" containerID="cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.138004 4845 scope.go:117] "RemoveContainer" containerID="75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.144422 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh64l"]
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.161172 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh64l"]
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.194427 4845 scope.go:117] "RemoveContainer" containerID="073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5"
Feb 02 11:39:21 crc kubenswrapper[4845]: E0202 11:39:21.195025 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5\": container with ID starting with 073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5 not found: ID does not exist" containerID="073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.195119 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5"} err="failed to get container status \"073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5\": rpc error: code = NotFound desc = could not find container \"073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5\": container with ID starting with 073226b6e0385de076f5a1c9fa4992d18cbc2ac8701878ac3a64f280f9918ac5 not found: ID does not exist"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.195175 4845 scope.go:117] "RemoveContainer" containerID="cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f"
Feb 02 11:39:21 crc kubenswrapper[4845]: E0202 11:39:21.195578 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f\": container with ID starting with cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f not found: ID does not exist" containerID="cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.195616 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f"} err="failed to get container status \"cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f\": rpc error: code = NotFound desc = could not find container \"cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f\": container with ID starting with cf3e00d0c3fb340efd3b1bce07e9a00e299f42f63131def5f28c9ed18183f99f not found: ID does not exist"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.195637 4845 scope.go:117] "RemoveContainer" containerID="75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b"
Feb 02 11:39:21 crc kubenswrapper[4845]: E0202 11:39:21.196000 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b\": container with ID starting with 75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b not found: ID does not exist" containerID="75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.196034 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b"} err="failed to get container status \"75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b\": rpc error: code = NotFound desc = could not find container \"75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b\": container with ID starting with 75e25bcbde5b6d5477f418d310a09313cb8027d9f6162a225707f9ab5ed36b8b not found: ID does not exist"
Feb 02 11:39:21 crc kubenswrapper[4845]: I0202 11:39:21.728318 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" path="/var/lib/kubelet/pods/67feb8df-07b7-4752-953d-fa9c66d6504f/volumes"
Feb 02 11:39:46 crc kubenswrapper[4845]: I0202 11:39:46.238183 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:39:46 crc kubenswrapper[4845]: I0202 11:39:46.239185 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.688386 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-266w9"]
Feb 02 11:40:07 crc kubenswrapper[4845]: E0202 11:40:07.689563 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="extract-content"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.689583 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="extract-content"
Feb 02 11:40:07 crc kubenswrapper[4845]: E0202 11:40:07.689614 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="registry-server"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.689622 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="registry-server"
Feb 02 11:40:07 crc kubenswrapper[4845]: E0202 11:40:07.689641 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="extract-utilities"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.689650 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="extract-utilities"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.691905 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="67feb8df-07b7-4752-953d-fa9c66d6504f" containerName="registry-server"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.694213 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.701939 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-266w9"]
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.704378 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-catalog-content\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.704464 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjng5\" (UniqueName: \"kubernetes.io/projected/2d007ebd-6c93-47d1-956b-7e27aab4bf22-kube-api-access-jjng5\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.704516 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-utilities\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.806047 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-catalog-content\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.806451 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjng5\" (UniqueName: \"kubernetes.io/projected/2d007ebd-6c93-47d1-956b-7e27aab4bf22-kube-api-access-jjng5\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.806483 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-utilities\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.806691 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-catalog-content\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.807060 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-utilities\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:07 crc kubenswrapper[4845]: I0202 11:40:07.828148 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjng5\" (UniqueName: \"kubernetes.io/projected/2d007ebd-6c93-47d1-956b-7e27aab4bf22-kube-api-access-jjng5\") pod \"redhat-operators-266w9\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:08 crc kubenswrapper[4845]: I0202 11:40:08.026847 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-266w9"
Feb 02 11:40:08 crc kubenswrapper[4845]: I0202 11:40:08.606322 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-266w9"]
Feb 02 11:40:09 crc kubenswrapper[4845]: I0202 11:40:09.556843 4845 generic.go:334] "Generic (PLEG): container finished" podID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerID="19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51" exitCode=0
Feb 02 11:40:09 crc kubenswrapper[4845]: I0202 11:40:09.557446 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-266w9" event={"ID":"2d007ebd-6c93-47d1-956b-7e27aab4bf22","Type":"ContainerDied","Data":"19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51"}
Feb 02 11:40:09 crc kubenswrapper[4845]: I0202 11:40:09.557494 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-266w9" event={"ID":"2d007ebd-6c93-47d1-956b-7e27aab4bf22","Type":"ContainerStarted","Data":"c968d34b73243acb3c2e71d7297b25828170435ef550b0aa3c2183bf30a6c523"}
Feb 02 11:40:11 crc kubenswrapper[4845]: I0202 11:40:11.585211 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-266w9" event={"ID":"2d007ebd-6c93-47d1-956b-7e27aab4bf22","Type":"ContainerStarted","Data":"cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185"}
Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.237734 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.238514 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.238564 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9"
Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.239483 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"800a02972d6ec813ad234012277534be95b3106d04c712ded96644b0403433e0"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.239558 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://800a02972d6ec813ad234012277534be95b3106d04c712ded96644b0403433e0" gracePeriod=600
Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.633034 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="800a02972d6ec813ad234012277534be95b3106d04c712ded96644b0403433e0" exitCode=0
Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.633107 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"800a02972d6ec813ad234012277534be95b3106d04c712ded96644b0403433e0"}
Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.633147 4845 scope.go:117] "RemoveContainer"
containerID="dd6b3930591fb138d17122481465759b520d0466bb89df22ab0dfde088cbcd5b" Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.635656 4845 generic.go:334] "Generic (PLEG): container finished" podID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerID="cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185" exitCode=0 Feb 02 11:40:16 crc kubenswrapper[4845]: I0202 11:40:16.635686 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-266w9" event={"ID":"2d007ebd-6c93-47d1-956b-7e27aab4bf22","Type":"ContainerDied","Data":"cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185"} Feb 02 11:40:17 crc kubenswrapper[4845]: I0202 11:40:17.650835 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"} Feb 02 11:40:17 crc kubenswrapper[4845]: I0202 11:40:17.653677 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-266w9" event={"ID":"2d007ebd-6c93-47d1-956b-7e27aab4bf22","Type":"ContainerStarted","Data":"9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829"} Feb 02 11:40:17 crc kubenswrapper[4845]: I0202 11:40:17.688327 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-266w9" podStartSLOduration=3.210041388 podStartE2EDuration="10.688292868s" podCreationTimestamp="2026-02-02 11:40:07 +0000 UTC" firstStartedPulling="2026-02-02 11:40:09.559406606 +0000 UTC m=+4090.650808056" lastFinishedPulling="2026-02-02 11:40:17.037658086 +0000 UTC m=+4098.129059536" observedRunningTime="2026-02-02 11:40:17.685719014 +0000 UTC m=+4098.777120464" watchObservedRunningTime="2026-02-02 11:40:17.688292868 +0000 UTC m=+4098.779694318" Feb 02 11:40:18 crc kubenswrapper[4845]: I0202 
11:40:18.027462 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-266w9" Feb 02 11:40:18 crc kubenswrapper[4845]: I0202 11:40:18.027639 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-266w9" Feb 02 11:40:19 crc kubenswrapper[4845]: I0202 11:40:19.079772 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-266w9" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="registry-server" probeResult="failure" output=< Feb 02 11:40:19 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 11:40:19 crc kubenswrapper[4845]: > Feb 02 11:40:28 crc kubenswrapper[4845]: I0202 11:40:28.083341 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-266w9" Feb 02 11:40:28 crc kubenswrapper[4845]: I0202 11:40:28.134578 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-266w9" Feb 02 11:40:28 crc kubenswrapper[4845]: I0202 11:40:28.320663 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-266w9"] Feb 02 11:40:29 crc kubenswrapper[4845]: I0202 11:40:29.770378 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-266w9" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="registry-server" containerID="cri-o://9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829" gracePeriod=2 Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.333871 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-266w9" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.364906 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-catalog-content\") pod \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.365293 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-utilities\") pod \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.365399 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjng5\" (UniqueName: \"kubernetes.io/projected/2d007ebd-6c93-47d1-956b-7e27aab4bf22-kube-api-access-jjng5\") pod \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\" (UID: \"2d007ebd-6c93-47d1-956b-7e27aab4bf22\") " Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.366437 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-utilities" (OuterVolumeSpecName: "utilities") pod "2d007ebd-6c93-47d1-956b-7e27aab4bf22" (UID: "2d007ebd-6c93-47d1-956b-7e27aab4bf22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.367494 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.382604 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d007ebd-6c93-47d1-956b-7e27aab4bf22-kube-api-access-jjng5" (OuterVolumeSpecName: "kube-api-access-jjng5") pod "2d007ebd-6c93-47d1-956b-7e27aab4bf22" (UID: "2d007ebd-6c93-47d1-956b-7e27aab4bf22"). InnerVolumeSpecName "kube-api-access-jjng5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.470797 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjng5\" (UniqueName: \"kubernetes.io/projected/2d007ebd-6c93-47d1-956b-7e27aab4bf22-kube-api-access-jjng5\") on node \"crc\" DevicePath \"\"" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.496661 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d007ebd-6c93-47d1-956b-7e27aab4bf22" (UID: "2d007ebd-6c93-47d1-956b-7e27aab4bf22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.573000 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d007ebd-6c93-47d1-956b-7e27aab4bf22-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.784178 4845 generic.go:334] "Generic (PLEG): container finished" podID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerID="9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829" exitCode=0 Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.784239 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-266w9" event={"ID":"2d007ebd-6c93-47d1-956b-7e27aab4bf22","Type":"ContainerDied","Data":"9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829"} Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.784277 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-266w9" event={"ID":"2d007ebd-6c93-47d1-956b-7e27aab4bf22","Type":"ContainerDied","Data":"c968d34b73243acb3c2e71d7297b25828170435ef550b0aa3c2183bf30a6c523"} Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.784298 4845 scope.go:117] "RemoveContainer" containerID="9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.785135 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-266w9" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.825089 4845 scope.go:117] "RemoveContainer" containerID="cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.830969 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-266w9"] Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.851358 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-266w9"] Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.860284 4845 scope.go:117] "RemoveContainer" containerID="19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.927511 4845 scope.go:117] "RemoveContainer" containerID="9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829" Feb 02 11:40:30 crc kubenswrapper[4845]: E0202 11:40:30.928248 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829\": container with ID starting with 9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829 not found: ID does not exist" containerID="9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.928337 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829"} err="failed to get container status \"9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829\": rpc error: code = NotFound desc = could not find container \"9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829\": container with ID starting with 9f85f47d7d53b247d2aaa9b2915576ddd222abfe2796babed6a11752fdb51829 not found: ID does 
not exist" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.928387 4845 scope.go:117] "RemoveContainer" containerID="cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185" Feb 02 11:40:30 crc kubenswrapper[4845]: E0202 11:40:30.928935 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185\": container with ID starting with cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185 not found: ID does not exist" containerID="cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.929102 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185"} err="failed to get container status \"cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185\": rpc error: code = NotFound desc = could not find container \"cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185\": container with ID starting with cbdd1b011726d9749fa73a1fbd542abedb6819140f686580ce200cc93604a185 not found: ID does not exist" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.929199 4845 scope.go:117] "RemoveContainer" containerID="19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51" Feb 02 11:40:30 crc kubenswrapper[4845]: E0202 11:40:30.929841 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51\": container with ID starting with 19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51 not found: ID does not exist" containerID="19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51" Feb 02 11:40:30 crc kubenswrapper[4845]: I0202 11:40:30.929899 4845 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51"} err="failed to get container status \"19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51\": rpc error: code = NotFound desc = could not find container \"19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51\": container with ID starting with 19c112365c91ffcc492cc29784664b1d9fbca11452f55521f5b62254fd898f51 not found: ID does not exist" Feb 02 11:40:31 crc kubenswrapper[4845]: I0202 11:40:31.737605 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" path="/var/lib/kubelet/pods/2d007ebd-6c93-47d1-956b-7e27aab4bf22/volumes" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.486930 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9kpk6"] Feb 02 11:41:12 crc kubenswrapper[4845]: E0202 11:41:12.488144 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="extract-content" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.488166 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="extract-content" Feb 02 11:41:12 crc kubenswrapper[4845]: E0202 11:41:12.488184 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="registry-server" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.488192 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="registry-server" Feb 02 11:41:12 crc kubenswrapper[4845]: E0202 11:41:12.488208 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="extract-utilities" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.488218 4845 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="extract-utilities" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.488574 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d007ebd-6c93-47d1-956b-7e27aab4bf22" containerName="registry-server" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.490666 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.502944 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9kpk6"] Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.638001 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-catalog-content\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.638253 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-utilities\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.638400 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr9x9\" (UniqueName: \"kubernetes.io/projected/47fee595-38a0-4778-a0f3-e9e4c5004787-kube-api-access-mr9x9\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 
11:41:12.741425 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-catalog-content\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.741546 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-utilities\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.741604 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr9x9\" (UniqueName: \"kubernetes.io/projected/47fee595-38a0-4778-a0f3-e9e4c5004787-kube-api-access-mr9x9\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.742086 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-catalog-content\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.742151 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-utilities\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.773327 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr9x9\" (UniqueName: \"kubernetes.io/projected/47fee595-38a0-4778-a0f3-e9e4c5004787-kube-api-access-mr9x9\") pod \"community-operators-9kpk6\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:12 crc kubenswrapper[4845]: I0202 11:41:12.820251 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:13 crc kubenswrapper[4845]: I0202 11:41:13.490626 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9kpk6"] Feb 02 11:41:14 crc kubenswrapper[4845]: I0202 11:41:14.257840 4845 generic.go:334] "Generic (PLEG): container finished" podID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerID="5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3" exitCode=0 Feb 02 11:41:14 crc kubenswrapper[4845]: I0202 11:41:14.258406 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kpk6" event={"ID":"47fee595-38a0-4778-a0f3-e9e4c5004787","Type":"ContainerDied","Data":"5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3"} Feb 02 11:41:14 crc kubenswrapper[4845]: I0202 11:41:14.258465 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kpk6" event={"ID":"47fee595-38a0-4778-a0f3-e9e4c5004787","Type":"ContainerStarted","Data":"f859c419b8dc176d5884a145bc04659a1fcf18f030931df90991f91e0ffc55cb"} Feb 02 11:41:16 crc kubenswrapper[4845]: I0202 11:41:16.287705 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kpk6" event={"ID":"47fee595-38a0-4778-a0f3-e9e4c5004787","Type":"ContainerStarted","Data":"8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c"} Feb 02 11:41:17 crc kubenswrapper[4845]: I0202 11:41:17.301862 4845 
generic.go:334] "Generic (PLEG): container finished" podID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerID="8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c" exitCode=0 Feb 02 11:41:17 crc kubenswrapper[4845]: I0202 11:41:17.301929 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kpk6" event={"ID":"47fee595-38a0-4778-a0f3-e9e4c5004787","Type":"ContainerDied","Data":"8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c"} Feb 02 11:41:19 crc kubenswrapper[4845]: I0202 11:41:19.332997 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kpk6" event={"ID":"47fee595-38a0-4778-a0f3-e9e4c5004787","Type":"ContainerStarted","Data":"7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3"} Feb 02 11:41:19 crc kubenswrapper[4845]: I0202 11:41:19.360643 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9kpk6" podStartSLOduration=3.567388535 podStartE2EDuration="7.36062426s" podCreationTimestamp="2026-02-02 11:41:12 +0000 UTC" firstStartedPulling="2026-02-02 11:41:14.266133771 +0000 UTC m=+4155.357535221" lastFinishedPulling="2026-02-02 11:41:18.059369496 +0000 UTC m=+4159.150770946" observedRunningTime="2026-02-02 11:41:19.357872071 +0000 UTC m=+4160.449273521" watchObservedRunningTime="2026-02-02 11:41:19.36062426 +0000 UTC m=+4160.452025720" Feb 02 11:41:22 crc kubenswrapper[4845]: I0202 11:41:22.823677 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:22 crc kubenswrapper[4845]: I0202 11:41:22.824303 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:22 crc kubenswrapper[4845]: I0202 11:41:22.876541 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.246802 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n48dz"] Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.251722 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.276664 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n48dz"] Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.387222 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-utilities\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.387308 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-catalog-content\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.387384 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xjpn\" (UniqueName: \"kubernetes.io/projected/aabb599d-7f64-43cf-8580-ae0f70c90035-kube-api-access-4xjpn\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.491017 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4xjpn\" (UniqueName: \"kubernetes.io/projected/aabb599d-7f64-43cf-8580-ae0f70c90035-kube-api-access-4xjpn\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.491370 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-utilities\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.491469 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-catalog-content\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.492089 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-utilities\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.492131 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-catalog-content\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.512628 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xjpn\" (UniqueName: 
\"kubernetes.io/projected/aabb599d-7f64-43cf-8580-ae0f70c90035-kube-api-access-4xjpn\") pod \"certified-operators-n48dz\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:29 crc kubenswrapper[4845]: I0202 11:41:29.578121 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:30 crc kubenswrapper[4845]: I0202 11:41:30.180771 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n48dz"] Feb 02 11:41:30 crc kubenswrapper[4845]: W0202 11:41:30.847920 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaabb599d_7f64_43cf_8580_ae0f70c90035.slice/crio-2c6b56543eb6367bcd8aaa40ba7660a352f53072e103c5d049826a8cca5fee21 WatchSource:0}: Error finding container 2c6b56543eb6367bcd8aaa40ba7660a352f53072e103c5d049826a8cca5fee21: Status 404 returned error can't find the container with id 2c6b56543eb6367bcd8aaa40ba7660a352f53072e103c5d049826a8cca5fee21 Feb 02 11:41:31 crc kubenswrapper[4845]: I0202 11:41:31.459264 4845 generic.go:334] "Generic (PLEG): container finished" podID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerID="d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d" exitCode=0 Feb 02 11:41:31 crc kubenswrapper[4845]: I0202 11:41:31.459328 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n48dz" event={"ID":"aabb599d-7f64-43cf-8580-ae0f70c90035","Type":"ContainerDied","Data":"d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d"} Feb 02 11:41:31 crc kubenswrapper[4845]: I0202 11:41:31.460292 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n48dz" 
event={"ID":"aabb599d-7f64-43cf-8580-ae0f70c90035","Type":"ContainerStarted","Data":"2c6b56543eb6367bcd8aaa40ba7660a352f53072e103c5d049826a8cca5fee21"} Feb 02 11:41:32 crc kubenswrapper[4845]: I0202 11:41:32.880599 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:33 crc kubenswrapper[4845]: I0202 11:41:33.488801 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n48dz" event={"ID":"aabb599d-7f64-43cf-8580-ae0f70c90035","Type":"ContainerStarted","Data":"1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390"} Feb 02 11:41:33 crc kubenswrapper[4845]: I0202 11:41:33.630246 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9kpk6"] Feb 02 11:41:33 crc kubenswrapper[4845]: I0202 11:41:33.630639 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9kpk6" podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerName="registry-server" containerID="cri-o://7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3" gracePeriod=2 Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.294749 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.478437 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr9x9\" (UniqueName: \"kubernetes.io/projected/47fee595-38a0-4778-a0f3-e9e4c5004787-kube-api-access-mr9x9\") pod \"47fee595-38a0-4778-a0f3-e9e4c5004787\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.478796 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-catalog-content\") pod \"47fee595-38a0-4778-a0f3-e9e4c5004787\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.478836 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-utilities\") pod \"47fee595-38a0-4778-a0f3-e9e4c5004787\" (UID: \"47fee595-38a0-4778-a0f3-e9e4c5004787\") " Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.480170 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-utilities" (OuterVolumeSpecName: "utilities") pod "47fee595-38a0-4778-a0f3-e9e4c5004787" (UID: "47fee595-38a0-4778-a0f3-e9e4c5004787"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.484103 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47fee595-38a0-4778-a0f3-e9e4c5004787-kube-api-access-mr9x9" (OuterVolumeSpecName: "kube-api-access-mr9x9") pod "47fee595-38a0-4778-a0f3-e9e4c5004787" (UID: "47fee595-38a0-4778-a0f3-e9e4c5004787"). InnerVolumeSpecName "kube-api-access-mr9x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.502295 4845 generic.go:334] "Generic (PLEG): container finished" podID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerID="7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3" exitCode=0 Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.503032 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9kpk6" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.503464 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kpk6" event={"ID":"47fee595-38a0-4778-a0f3-e9e4c5004787","Type":"ContainerDied","Data":"7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3"} Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.503505 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9kpk6" event={"ID":"47fee595-38a0-4778-a0f3-e9e4c5004787","Type":"ContainerDied","Data":"f859c419b8dc176d5884a145bc04659a1fcf18f030931df90991f91e0ffc55cb"} Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.503527 4845 scope.go:117] "RemoveContainer" containerID="7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.542299 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47fee595-38a0-4778-a0f3-e9e4c5004787" (UID: "47fee595-38a0-4778-a0f3-e9e4c5004787"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.561661 4845 scope.go:117] "RemoveContainer" containerID="8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.582136 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr9x9\" (UniqueName: \"kubernetes.io/projected/47fee595-38a0-4778-a0f3-e9e4c5004787-kube-api-access-mr9x9\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.582172 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.582182 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47fee595-38a0-4778-a0f3-e9e4c5004787-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.633302 4845 scope.go:117] "RemoveContainer" containerID="5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.693381 4845 scope.go:117] "RemoveContainer" containerID="7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3" Feb 02 11:41:34 crc kubenswrapper[4845]: E0202 11:41:34.694051 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3\": container with ID starting with 7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3 not found: ID does not exist" containerID="7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.694123 4845 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3"} err="failed to get container status \"7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3\": rpc error: code = NotFound desc = could not find container \"7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3\": container with ID starting with 7f4c32c86c81432e75b8b963e39f54cdbd13f2970162f200c9c79831d58966f3 not found: ID does not exist" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.694163 4845 scope.go:117] "RemoveContainer" containerID="8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c" Feb 02 11:41:34 crc kubenswrapper[4845]: E0202 11:41:34.694716 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c\": container with ID starting with 8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c not found: ID does not exist" containerID="8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.694770 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c"} err="failed to get container status \"8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c\": rpc error: code = NotFound desc = could not find container \"8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c\": container with ID starting with 8c6b33ba6ec986656edb201decc27f2cdd09c822f842166087490f5082946d2c not found: ID does not exist" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.694812 4845 scope.go:117] "RemoveContainer" containerID="5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3" Feb 02 11:41:34 crc kubenswrapper[4845]: E0202 11:41:34.695161 4845 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3\": container with ID starting with 5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3 not found: ID does not exist" containerID="5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.695198 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3"} err="failed to get container status \"5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3\": rpc error: code = NotFound desc = could not find container \"5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3\": container with ID starting with 5eae645488d5eae73f94ff42e591bcb15170f085ebc2b4587ea7c2c56f1f59b3 not found: ID does not exist" Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.844841 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9kpk6"] Feb 02 11:41:34 crc kubenswrapper[4845]: I0202 11:41:34.865850 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9kpk6"] Feb 02 11:41:35 crc kubenswrapper[4845]: I0202 11:41:35.521318 4845 generic.go:334] "Generic (PLEG): container finished" podID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerID="1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390" exitCode=0 Feb 02 11:41:35 crc kubenswrapper[4845]: I0202 11:41:35.521404 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n48dz" event={"ID":"aabb599d-7f64-43cf-8580-ae0f70c90035","Type":"ContainerDied","Data":"1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390"} Feb 02 11:41:35 crc kubenswrapper[4845]: I0202 11:41:35.729128 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" path="/var/lib/kubelet/pods/47fee595-38a0-4778-a0f3-e9e4c5004787/volumes" Feb 02 11:41:36 crc kubenswrapper[4845]: I0202 11:41:36.535668 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n48dz" event={"ID":"aabb599d-7f64-43cf-8580-ae0f70c90035","Type":"ContainerStarted","Data":"858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7"} Feb 02 11:41:36 crc kubenswrapper[4845]: I0202 11:41:36.569614 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n48dz" podStartSLOduration=3.069253344 podStartE2EDuration="7.569589525s" podCreationTimestamp="2026-02-02 11:41:29 +0000 UTC" firstStartedPulling="2026-02-02 11:41:31.461700511 +0000 UTC m=+4172.553101961" lastFinishedPulling="2026-02-02 11:41:35.962036692 +0000 UTC m=+4177.053438142" observedRunningTime="2026-02-02 11:41:36.554909073 +0000 UTC m=+4177.646310523" watchObservedRunningTime="2026-02-02 11:41:36.569589525 +0000 UTC m=+4177.660990965" Feb 02 11:41:39 crc kubenswrapper[4845]: I0202 11:41:39.578783 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:39 crc kubenswrapper[4845]: I0202 11:41:39.579213 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:39 crc kubenswrapper[4845]: I0202 11:41:39.634126 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:49 crc kubenswrapper[4845]: I0202 11:41:49.629546 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:49 crc kubenswrapper[4845]: I0202 11:41:49.702400 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-n48dz"] Feb 02 11:41:49 crc kubenswrapper[4845]: I0202 11:41:49.702794 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n48dz" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerName="registry-server" containerID="cri-o://858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7" gracePeriod=2 Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.265095 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.301287 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-utilities\") pod \"aabb599d-7f64-43cf-8580-ae0f70c90035\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.301360 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-catalog-content\") pod \"aabb599d-7f64-43cf-8580-ae0f70c90035\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.301533 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xjpn\" (UniqueName: \"kubernetes.io/projected/aabb599d-7f64-43cf-8580-ae0f70c90035-kube-api-access-4xjpn\") pod \"aabb599d-7f64-43cf-8580-ae0f70c90035\" (UID: \"aabb599d-7f64-43cf-8580-ae0f70c90035\") " Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.303001 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-utilities" (OuterVolumeSpecName: "utilities") pod "aabb599d-7f64-43cf-8580-ae0f70c90035" (UID: 
"aabb599d-7f64-43cf-8580-ae0f70c90035"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.308952 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aabb599d-7f64-43cf-8580-ae0f70c90035-kube-api-access-4xjpn" (OuterVolumeSpecName: "kube-api-access-4xjpn") pod "aabb599d-7f64-43cf-8580-ae0f70c90035" (UID: "aabb599d-7f64-43cf-8580-ae0f70c90035"). InnerVolumeSpecName "kube-api-access-4xjpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.378584 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aabb599d-7f64-43cf-8580-ae0f70c90035" (UID: "aabb599d-7f64-43cf-8580-ae0f70c90035"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.405095 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.405133 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aabb599d-7f64-43cf-8580-ae0f70c90035-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.405145 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xjpn\" (UniqueName: \"kubernetes.io/projected/aabb599d-7f64-43cf-8580-ae0f70c90035-kube-api-access-4xjpn\") on node \"crc\" DevicePath \"\"" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.689396 4845 generic.go:334] "Generic (PLEG): container finished" 
podID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerID="858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7" exitCode=0 Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.689469 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n48dz" event={"ID":"aabb599d-7f64-43cf-8580-ae0f70c90035","Type":"ContainerDied","Data":"858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7"} Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.689774 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n48dz" event={"ID":"aabb599d-7f64-43cf-8580-ae0f70c90035","Type":"ContainerDied","Data":"2c6b56543eb6367bcd8aaa40ba7660a352f53072e103c5d049826a8cca5fee21"} Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.689803 4845 scope.go:117] "RemoveContainer" containerID="858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.689497 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n48dz" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.725367 4845 scope.go:117] "RemoveContainer" containerID="1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.734203 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n48dz"] Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.745513 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n48dz"] Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.751698 4845 scope.go:117] "RemoveContainer" containerID="d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.819649 4845 scope.go:117] "RemoveContainer" containerID="858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7" Feb 02 11:41:50 crc kubenswrapper[4845]: E0202 11:41:50.820087 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7\": container with ID starting with 858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7 not found: ID does not exist" containerID="858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.820124 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7"} err="failed to get container status \"858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7\": rpc error: code = NotFound desc = could not find container \"858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7\": container with ID starting with 858e2ad35137cabc57496d24f9faadab9da1de8d79e3279363178d2796ca85c7 not 
found: ID does not exist" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.820148 4845 scope.go:117] "RemoveContainer" containerID="1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390" Feb 02 11:41:50 crc kubenswrapper[4845]: E0202 11:41:50.820375 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390\": container with ID starting with 1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390 not found: ID does not exist" containerID="1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.820397 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390"} err="failed to get container status \"1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390\": rpc error: code = NotFound desc = could not find container \"1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390\": container with ID starting with 1fbe7e6b5331292a7b57c3a153e77040eb16b249e6958ff12337b94b4aeda390 not found: ID does not exist" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.820427 4845 scope.go:117] "RemoveContainer" containerID="d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d" Feb 02 11:41:50 crc kubenswrapper[4845]: E0202 11:41:50.820672 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d\": container with ID starting with d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d not found: ID does not exist" containerID="d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d" Feb 02 11:41:50 crc kubenswrapper[4845]: I0202 11:41:50.820701 4845 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d"} err="failed to get container status \"d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d\": rpc error: code = NotFound desc = could not find container \"d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d\": container with ID starting with d516417ab0b6ce99ddf176a27417c14b42869cbf109dc31e5d52f4a8f17c836d not found: ID does not exist" Feb 02 11:41:51 crc kubenswrapper[4845]: I0202 11:41:51.729322 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" path="/var/lib/kubelet/pods/aabb599d-7f64-43cf-8580-ae0f70c90035/volumes" Feb 02 11:42:46 crc kubenswrapper[4845]: I0202 11:42:46.237369 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:42:46 crc kubenswrapper[4845]: I0202 11:42:46.238056 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:43:16 crc kubenswrapper[4845]: I0202 11:43:16.237786 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:43:16 crc kubenswrapper[4845]: I0202 11:43:16.238419 4845 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.237859 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.238477 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.238556 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.240677 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.240748 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" 
containerID="cri-o://a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e" gracePeriod=600 Feb 02 11:43:46 crc kubenswrapper[4845]: E0202 11:43:46.368177 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.958020 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e" exitCode=0 Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.958074 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"} Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.958117 4845 scope.go:117] "RemoveContainer" containerID="800a02972d6ec813ad234012277534be95b3106d04c712ded96644b0403433e0" Feb 02 11:43:46 crc kubenswrapper[4845]: I0202 11:43:46.959217 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e" Feb 02 11:43:46 crc kubenswrapper[4845]: E0202 11:43:46.959646 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:44:00 crc kubenswrapper[4845]: I0202 11:44:00.712757 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e" Feb 02 11:44:00 crc kubenswrapper[4845]: E0202 11:44:00.713997 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:44:12 crc kubenswrapper[4845]: I0202 11:44:12.714195 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e" Feb 02 11:44:12 crc kubenswrapper[4845]: E0202 11:44:12.715386 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:44:25 crc kubenswrapper[4845]: I0202 11:44:25.713150 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e" Feb 02 11:44:25 crc kubenswrapper[4845]: E0202 11:44:25.714195 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:44:37 crc kubenswrapper[4845]: I0202 11:44:37.714139 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:44:37 crc kubenswrapper[4845]: E0202 11:44:37.716366 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:44:52 crc kubenswrapper[4845]: I0202 11:44:52.714562 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:44:52 crc kubenswrapper[4845]: E0202 11:44:52.716419 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.183957 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"]
Feb 02 11:45:00 crc kubenswrapper[4845]: E0202 11:45:00.185209 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerName="extract-utilities"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.185235 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerName="extract-utilities"
Feb 02 11:45:00 crc kubenswrapper[4845]: E0202 11:45:00.185254 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerName="extract-content"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.185260 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerName="extract-content"
Feb 02 11:45:00 crc kubenswrapper[4845]: E0202 11:45:00.185280 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerName="registry-server"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.185286 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerName="registry-server"
Feb 02 11:45:00 crc kubenswrapper[4845]: E0202 11:45:00.185299 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerName="extract-utilities"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.185305 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerName="extract-utilities"
Feb 02 11:45:00 crc kubenswrapper[4845]: E0202 11:45:00.185313 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerName="extract-content"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.185319 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerName="extract-content"
Feb 02 11:45:00 crc kubenswrapper[4845]: E0202 11:45:00.185377 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerName="registry-server"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.185388 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerName="registry-server"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.185654 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="47fee595-38a0-4778-a0f3-e9e4c5004787" containerName="registry-server"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.185684 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabb599d-7f64-43cf-8580-ae0f70c90035" containerName="registry-server"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.186761 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.189241 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.191042 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.196915 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"]
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.217450 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-secret-volume\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.217658 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxqk5\" (UniqueName: \"kubernetes.io/projected/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-kube-api-access-zxqk5\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.217771 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-config-volume\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.319833 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-secret-volume\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.319964 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxqk5\" (UniqueName: \"kubernetes.io/projected/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-kube-api-access-zxqk5\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.320019 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-config-volume\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.321051 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-config-volume\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.326135 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-secret-volume\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.340133 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxqk5\" (UniqueName: \"kubernetes.io/projected/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-kube-api-access-zxqk5\") pod \"collect-profiles-29500545-wz76d\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:00 crc kubenswrapper[4845]: I0202 11:45:00.519542 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:01 crc kubenswrapper[4845]: I0202 11:45:01.041754 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"]
Feb 02 11:45:01 crc kubenswrapper[4845]: I0202 11:45:01.762808 4845 generic.go:334] "Generic (PLEG): container finished" podID="ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a" containerID="e84bf218f680128502516aca2b553261d33f0d15f1e0480a845a83cff580ec8e" exitCode=0
Feb 02 11:45:01 crc kubenswrapper[4845]: I0202 11:45:01.762852 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d" event={"ID":"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a","Type":"ContainerDied","Data":"e84bf218f680128502516aca2b553261d33f0d15f1e0480a845a83cff580ec8e"}
Feb 02 11:45:01 crc kubenswrapper[4845]: I0202 11:45:01.763100 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d" event={"ID":"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a","Type":"ContainerStarted","Data":"2952da291d63c09a95bde3fdb6ca7d09f64a7bc5ff5cad303ce2d5892d497cf0"}
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.222671 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.400756 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxqk5\" (UniqueName: \"kubernetes.io/projected/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-kube-api-access-zxqk5\") pod \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") "
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.400859 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-secret-volume\") pod \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") "
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.401160 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-config-volume\") pod \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\" (UID: \"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a\") "
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.402182 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-config-volume" (OuterVolumeSpecName: "config-volume") pod "ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a" (UID: "ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.407557 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a" (UID: "ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.409201 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-kube-api-access-zxqk5" (OuterVolumeSpecName: "kube-api-access-zxqk5") pod "ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a" (UID: "ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a"). InnerVolumeSpecName "kube-api-access-zxqk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.504215 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-config-volume\") on node \"crc\" DevicePath \"\""
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.504255 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxqk5\" (UniqueName: \"kubernetes.io/projected/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-kube-api-access-zxqk5\") on node \"crc\" DevicePath \"\""
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.504268 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.789312 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d" event={"ID":"ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a","Type":"ContainerDied","Data":"2952da291d63c09a95bde3fdb6ca7d09f64a7bc5ff5cad303ce2d5892d497cf0"}
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.789365 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2952da291d63c09a95bde3fdb6ca7d09f64a7bc5ff5cad303ce2d5892d497cf0"
Feb 02 11:45:03 crc kubenswrapper[4845]: I0202 11:45:03.789403 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-wz76d"
Feb 02 11:45:04 crc kubenswrapper[4845]: I0202 11:45:04.320448 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"]
Feb 02 11:45:04 crc kubenswrapper[4845]: I0202 11:45:04.331566 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-g5gxb"]
Feb 02 11:45:05 crc kubenswrapper[4845]: I0202 11:45:05.729906 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b535e9d-4510-4191-9ab5-768d449b7bc3" path="/var/lib/kubelet/pods/6b535e9d-4510-4191-9ab5-768d449b7bc3/volumes"
Feb 02 11:45:07 crc kubenswrapper[4845]: I0202 11:45:07.713771 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:45:07 crc kubenswrapper[4845]: E0202 11:45:07.714432 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:45:11 crc kubenswrapper[4845]: I0202 11:45:11.752269 4845 scope.go:117] "RemoveContainer" containerID="e8b882d2679f84fe117d5aa26326dd7526ca6a2f74a7144d1db5d6a93a707b5a"
Feb 02 11:45:19 crc kubenswrapper[4845]: I0202 11:45:19.723319 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:45:19 crc kubenswrapper[4845]: E0202 11:45:19.724223 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:45:30 crc kubenswrapper[4845]: I0202 11:45:30.713531 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:45:30 crc kubenswrapper[4845]: E0202 11:45:30.714585 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:45:45 crc kubenswrapper[4845]: I0202 11:45:45.712741 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:45:45 crc kubenswrapper[4845]: E0202 11:45:45.713567 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:46:00 crc kubenswrapper[4845]: I0202 11:46:00.714328 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:46:00 crc kubenswrapper[4845]: E0202 11:46:00.716072 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:46:13 crc kubenswrapper[4845]: I0202 11:46:13.713138 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:46:13 crc kubenswrapper[4845]: E0202 11:46:13.714077 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:46:26 crc kubenswrapper[4845]: I0202 11:46:26.713997 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:46:26 crc kubenswrapper[4845]: E0202 11:46:26.714955 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:46:41 crc kubenswrapper[4845]: I0202 11:46:41.713606 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:46:41 crc kubenswrapper[4845]: E0202 11:46:41.715138 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:46:53 crc kubenswrapper[4845]: I0202 11:46:53.713397 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:46:53 crc kubenswrapper[4845]: E0202 11:46:53.714301 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:47:06 crc kubenswrapper[4845]: I0202 11:47:06.714142 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:47:06 crc kubenswrapper[4845]: E0202 11:47:06.716003 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:47:17 crc kubenswrapper[4845]: I0202 11:47:17.714293 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:47:17 crc kubenswrapper[4845]: E0202 11:47:17.716834 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:47:30 crc kubenswrapper[4845]: I0202 11:47:30.713210 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:47:30 crc kubenswrapper[4845]: E0202 11:47:30.714031 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:47:41 crc kubenswrapper[4845]: I0202 11:47:41.713902 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:47:41 crc kubenswrapper[4845]: E0202 11:47:41.715007 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:47:53 crc kubenswrapper[4845]: I0202 11:47:53.713158 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:47:53 crc kubenswrapper[4845]: E0202 11:47:53.714095 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:48:07 crc kubenswrapper[4845]: I0202 11:48:07.713001 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:48:07 crc kubenswrapper[4845]: E0202 11:48:07.713745 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:48:19 crc kubenswrapper[4845]: I0202 11:48:19.725532 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:48:19 crc kubenswrapper[4845]: E0202 11:48:19.728182 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:48:32 crc kubenswrapper[4845]: I0202 11:48:32.712699 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:48:32 crc kubenswrapper[4845]: E0202 11:48:32.714760 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:48:47 crc kubenswrapper[4845]: I0202 11:48:47.713298 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:48:48 crc kubenswrapper[4845]: I0202 11:48:48.366813 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"4667d56dc23295df8831dac2a0e953a5f6c9c92d45b3e4effcfbe305f29defc5"}
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.562903 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zsl9w"]
Feb 02 11:49:27 crc kubenswrapper[4845]: E0202 11:49:27.564332 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a" containerName="collect-profiles"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.564353 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a" containerName="collect-profiles"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.564647 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad9d5873-9ee1-49b9-9cc7-ebed0a0a0c7a" containerName="collect-profiles"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.566919 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.578491 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsl9w"]
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.649284 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfrxd\" (UniqueName: \"kubernetes.io/projected/3b4186e3-63c2-40ae-8303-6f87b8e32247-kube-api-access-dfrxd\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.649354 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-catalog-content\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.649551 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-utilities\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.763331 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-utilities\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.763837 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfrxd\" (UniqueName: \"kubernetes.io/projected/3b4186e3-63c2-40ae-8303-6f87b8e32247-kube-api-access-dfrxd\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.763904 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-utilities\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.763932 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-catalog-content\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.764384 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-catalog-content\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.790816 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfrxd\" (UniqueName: \"kubernetes.io/projected/3b4186e3-63c2-40ae-8303-6f87b8e32247-kube-api-access-dfrxd\") pod \"redhat-marketplace-zsl9w\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") " pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:27 crc kubenswrapper[4845]: I0202 11:49:27.899319 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:28 crc kubenswrapper[4845]: I0202 11:49:28.490407 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsl9w"]
Feb 02 11:49:28 crc kubenswrapper[4845]: I0202 11:49:28.780035 4845 generic.go:334] "Generic (PLEG): container finished" podID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerID="ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1" exitCode=0
Feb 02 11:49:28 crc kubenswrapper[4845]: I0202 11:49:28.780107 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsl9w" event={"ID":"3b4186e3-63c2-40ae-8303-6f87b8e32247","Type":"ContainerDied","Data":"ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1"}
Feb 02 11:49:28 crc kubenswrapper[4845]: I0202 11:49:28.780417 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsl9w" event={"ID":"3b4186e3-63c2-40ae-8303-6f87b8e32247","Type":"ContainerStarted","Data":"9bbd4f552c2c3759ffc1f28e4eaea61e8145683e0d4c4747293158c6d5f24d0d"}
Feb 02 11:49:28 crc kubenswrapper[4845]: I0202 11:49:28.782077 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 11:49:29 crc kubenswrapper[4845]: I0202 11:49:29.794154 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsl9w" event={"ID":"3b4186e3-63c2-40ae-8303-6f87b8e32247","Type":"ContainerStarted","Data":"958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26"}
Feb 02 11:49:30 crc kubenswrapper[4845]: I0202 11:49:30.806613 4845 generic.go:334] "Generic (PLEG): container finished" podID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerID="958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26" exitCode=0
Feb 02 11:49:30 crc kubenswrapper[4845]: I0202 11:49:30.806836 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsl9w" event={"ID":"3b4186e3-63c2-40ae-8303-6f87b8e32247","Type":"ContainerDied","Data":"958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26"}
Feb 02 11:49:31 crc kubenswrapper[4845]: I0202 11:49:31.819631 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsl9w" event={"ID":"3b4186e3-63c2-40ae-8303-6f87b8e32247","Type":"ContainerStarted","Data":"646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6"}
Feb 02 11:49:31 crc kubenswrapper[4845]: I0202 11:49:31.848687 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zsl9w" podStartSLOduration=2.2663620460000002 podStartE2EDuration="4.848659734s" podCreationTimestamp="2026-02-02 11:49:27 +0000 UTC" firstStartedPulling="2026-02-02 11:49:28.781821568 +0000 UTC m=+4649.873223008" lastFinishedPulling="2026-02-02 11:49:31.364119246 +0000 UTC m=+4652.455520696" observedRunningTime="2026-02-02 11:49:31.842502577 +0000 UTC m=+4652.933904027" watchObservedRunningTime="2026-02-02 11:49:31.848659734 +0000 UTC m=+4652.940061184"
Feb 02 11:49:37 crc kubenswrapper[4845]: I0202 11:49:37.900292 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:37 crc kubenswrapper[4845]: I0202 11:49:37.900843 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:37 crc kubenswrapper[4845]: I0202 11:49:37.955068 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:38 crc kubenswrapper[4845]: I0202 11:49:38.954410 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:39 crc kubenswrapper[4845]: I0202 11:49:39.014520 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsl9w"]
Feb 02 11:49:40 crc kubenswrapper[4845]: I0202 11:49:40.912967 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zsl9w" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerName="registry-server" containerID="cri-o://646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6" gracePeriod=2
Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.496648 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsl9w"
Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.623375 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-catalog-content\") pod \"3b4186e3-63c2-40ae-8303-6f87b8e32247\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") "
Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.623904 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfrxd\" (UniqueName: \"kubernetes.io/projected/3b4186e3-63c2-40ae-8303-6f87b8e32247-kube-api-access-dfrxd\") pod \"3b4186e3-63c2-40ae-8303-6f87b8e32247\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") "
Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.624014 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-utilities\") pod \"3b4186e3-63c2-40ae-8303-6f87b8e32247\" (UID: \"3b4186e3-63c2-40ae-8303-6f87b8e32247\") "
Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.625193 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-utilities" (OuterVolumeSpecName: "utilities") pod "3b4186e3-63c2-40ae-8303-6f87b8e32247" (UID: "3b4186e3-63c2-40ae-8303-6f87b8e32247"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.631582 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4186e3-63c2-40ae-8303-6f87b8e32247-kube-api-access-dfrxd" (OuterVolumeSpecName: "kube-api-access-dfrxd") pod "3b4186e3-63c2-40ae-8303-6f87b8e32247" (UID: "3b4186e3-63c2-40ae-8303-6f87b8e32247"). InnerVolumeSpecName "kube-api-access-dfrxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.653676 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b4186e3-63c2-40ae-8303-6f87b8e32247" (UID: "3b4186e3-63c2-40ae-8303-6f87b8e32247"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.729237 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.729534 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b4186e3-63c2-40ae-8303-6f87b8e32247-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.729631 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfrxd\" (UniqueName: \"kubernetes.io/projected/3b4186e3-63c2-40ae-8303-6f87b8e32247-kube-api-access-dfrxd\") on node \"crc\" DevicePath \"\"" Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.932594 4845 generic.go:334] "Generic (PLEG): container finished" podID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerID="646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6" exitCode=0 Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.932932 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsl9w" event={"ID":"3b4186e3-63c2-40ae-8303-6f87b8e32247","Type":"ContainerDied","Data":"646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6"} Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.932960 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zsl9w" event={"ID":"3b4186e3-63c2-40ae-8303-6f87b8e32247","Type":"ContainerDied","Data":"9bbd4f552c2c3759ffc1f28e4eaea61e8145683e0d4c4747293158c6d5f24d0d"} Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.932978 4845 scope.go:117] "RemoveContainer" containerID="646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6" Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 
11:49:41.933153 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zsl9w" Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.968764 4845 scope.go:117] "RemoveContainer" containerID="958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26" Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.981436 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsl9w"] Feb 02 11:49:41 crc kubenswrapper[4845]: I0202 11:49:41.990936 4845 scope.go:117] "RemoveContainer" containerID="ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1" Feb 02 11:49:42 crc kubenswrapper[4845]: I0202 11:49:41.996862 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zsl9w"] Feb 02 11:49:42 crc kubenswrapper[4845]: I0202 11:49:42.053922 4845 scope.go:117] "RemoveContainer" containerID="646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6" Feb 02 11:49:42 crc kubenswrapper[4845]: E0202 11:49:42.056149 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6\": container with ID starting with 646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6 not found: ID does not exist" containerID="646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6" Feb 02 11:49:42 crc kubenswrapper[4845]: I0202 11:49:42.056199 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6"} err="failed to get container status \"646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6\": rpc error: code = NotFound desc = could not find container \"646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6\": container with ID starting with 
646c431ceeeb23b04c98986f17e9845a9be91c82dc4b1423606fc9bcefe75bb6 not found: ID does not exist" Feb 02 11:49:42 crc kubenswrapper[4845]: I0202 11:49:42.056228 4845 scope.go:117] "RemoveContainer" containerID="958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26" Feb 02 11:49:42 crc kubenswrapper[4845]: E0202 11:49:42.059116 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26\": container with ID starting with 958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26 not found: ID does not exist" containerID="958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26" Feb 02 11:49:42 crc kubenswrapper[4845]: I0202 11:49:42.059155 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26"} err="failed to get container status \"958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26\": rpc error: code = NotFound desc = could not find container \"958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26\": container with ID starting with 958770bf08d0e5aadf525e0aa885f2dff0de3e468a1514af21597dbe72339a26 not found: ID does not exist" Feb 02 11:49:42 crc kubenswrapper[4845]: I0202 11:49:42.059179 4845 scope.go:117] "RemoveContainer" containerID="ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1" Feb 02 11:49:42 crc kubenswrapper[4845]: E0202 11:49:42.059591 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1\": container with ID starting with ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1 not found: ID does not exist" containerID="ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1" Feb 02 11:49:42 crc 
kubenswrapper[4845]: I0202 11:49:42.059615 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1"} err="failed to get container status \"ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1\": rpc error: code = NotFound desc = could not find container \"ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1\": container with ID starting with ad5535d576e128d2308fb411ec8c8d2684ef48e47342e89c0a835295b3e0b0a1 not found: ID does not exist" Feb 02 11:49:43 crc kubenswrapper[4845]: I0202 11:49:43.727089 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" path="/var/lib/kubelet/pods/3b4186e3-63c2-40ae-8303-6f87b8e32247/volumes" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.564527 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hsjdb"] Feb 02 11:50:24 crc kubenswrapper[4845]: E0202 11:50:24.565726 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerName="extract-utilities" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.565745 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerName="extract-utilities" Feb 02 11:50:24 crc kubenswrapper[4845]: E0202 11:50:24.565766 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerName="registry-server" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.565774 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerName="registry-server" Feb 02 11:50:24 crc kubenswrapper[4845]: E0202 11:50:24.565797 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerName="extract-content" Feb 02 
11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.565805 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerName="extract-content" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.566182 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4186e3-63c2-40ae-8303-6f87b8e32247" containerName="registry-server" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.568152 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.578828 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hsjdb"] Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.724977 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q9vs\" (UniqueName: \"kubernetes.io/projected/75a56863-1da1-4828-8e74-01d392b7c313-kube-api-access-8q9vs\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.725162 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-catalog-content\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.725225 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-utilities\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc 
kubenswrapper[4845]: I0202 11:50:24.827762 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q9vs\" (UniqueName: \"kubernetes.io/projected/75a56863-1da1-4828-8e74-01d392b7c313-kube-api-access-8q9vs\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.827854 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-catalog-content\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.827914 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-utilities\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.828664 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-catalog-content\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.828910 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-utilities\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.855113 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q9vs\" (UniqueName: \"kubernetes.io/projected/75a56863-1da1-4828-8e74-01d392b7c313-kube-api-access-8q9vs\") pod \"redhat-operators-hsjdb\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:24 crc kubenswrapper[4845]: I0202 11:50:24.928452 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:25 crc kubenswrapper[4845]: I0202 11:50:25.448060 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hsjdb"] Feb 02 11:50:26 crc kubenswrapper[4845]: I0202 11:50:26.393753 4845 generic.go:334] "Generic (PLEG): container finished" podID="75a56863-1da1-4828-8e74-01d392b7c313" containerID="403a9328fa4fcfabf4647f93c511ea9eb8de34eced9a1a5d2bb052a8e9e9f78d" exitCode=0 Feb 02 11:50:26 crc kubenswrapper[4845]: I0202 11:50:26.394114 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsjdb" event={"ID":"75a56863-1da1-4828-8e74-01d392b7c313","Type":"ContainerDied","Data":"403a9328fa4fcfabf4647f93c511ea9eb8de34eced9a1a5d2bb052a8e9e9f78d"} Feb 02 11:50:26 crc kubenswrapper[4845]: I0202 11:50:26.394407 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsjdb" event={"ID":"75a56863-1da1-4828-8e74-01d392b7c313","Type":"ContainerStarted","Data":"2dc8add61b9b40ac8cd1bce94cbf07534556e8498aa890fa3600d1e8cf918dee"} Feb 02 11:50:28 crc kubenswrapper[4845]: I0202 11:50:28.417328 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsjdb" event={"ID":"75a56863-1da1-4828-8e74-01d392b7c313","Type":"ContainerStarted","Data":"7f1d76ee71825e87e9057fb9875f9f91dc85c07d5e380783111cfb85e63b2628"} Feb 02 11:50:31 crc kubenswrapper[4845]: I0202 11:50:31.448958 4845 generic.go:334] "Generic 
(PLEG): container finished" podID="75a56863-1da1-4828-8e74-01d392b7c313" containerID="7f1d76ee71825e87e9057fb9875f9f91dc85c07d5e380783111cfb85e63b2628" exitCode=0 Feb 02 11:50:31 crc kubenswrapper[4845]: I0202 11:50:31.449039 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsjdb" event={"ID":"75a56863-1da1-4828-8e74-01d392b7c313","Type":"ContainerDied","Data":"7f1d76ee71825e87e9057fb9875f9f91dc85c07d5e380783111cfb85e63b2628"} Feb 02 11:50:32 crc kubenswrapper[4845]: I0202 11:50:32.461675 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsjdb" event={"ID":"75a56863-1da1-4828-8e74-01d392b7c313","Type":"ContainerStarted","Data":"1efc4b956319dcbaebabb1dffd7873abe974a811b1f3b11b91cc34dac1d0755b"} Feb 02 11:50:32 crc kubenswrapper[4845]: I0202 11:50:32.484360 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hsjdb" podStartSLOduration=3.033318439 podStartE2EDuration="8.484338569s" podCreationTimestamp="2026-02-02 11:50:24 +0000 UTC" firstStartedPulling="2026-02-02 11:50:26.396768651 +0000 UTC m=+4707.488170101" lastFinishedPulling="2026-02-02 11:50:31.847788791 +0000 UTC m=+4712.939190231" observedRunningTime="2026-02-02 11:50:32.481750784 +0000 UTC m=+4713.573152234" watchObservedRunningTime="2026-02-02 11:50:32.484338569 +0000 UTC m=+4713.575740029" Feb 02 11:50:34 crc kubenswrapper[4845]: I0202 11:50:34.930036 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:34 crc kubenswrapper[4845]: I0202 11:50:34.930528 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:36 crc kubenswrapper[4845]: I0202 11:50:36.797461 4845 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hsjdb" 
podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="registry-server" probeResult="failure" output=< Feb 02 11:50:36 crc kubenswrapper[4845]: timeout: failed to connect service ":50051" within 1s Feb 02 11:50:36 crc kubenswrapper[4845]: > Feb 02 11:50:44 crc kubenswrapper[4845]: I0202 11:50:44.976349 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:45 crc kubenswrapper[4845]: I0202 11:50:45.028091 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:45 crc kubenswrapper[4845]: I0202 11:50:45.214918 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hsjdb"] Feb 02 11:50:46 crc kubenswrapper[4845]: I0202 11:50:46.664987 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hsjdb" podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="registry-server" containerID="cri-o://1efc4b956319dcbaebabb1dffd7873abe974a811b1f3b11b91cc34dac1d0755b" gracePeriod=2 Feb 02 11:50:47 crc kubenswrapper[4845]: I0202 11:50:47.677072 4845 generic.go:334] "Generic (PLEG): container finished" podID="75a56863-1da1-4828-8e74-01d392b7c313" containerID="1efc4b956319dcbaebabb1dffd7873abe974a811b1f3b11b91cc34dac1d0755b" exitCode=0 Feb 02 11:50:47 crc kubenswrapper[4845]: I0202 11:50:47.677144 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsjdb" event={"ID":"75a56863-1da1-4828-8e74-01d392b7c313","Type":"ContainerDied","Data":"1efc4b956319dcbaebabb1dffd7873abe974a811b1f3b11b91cc34dac1d0755b"} Feb 02 11:50:47 crc kubenswrapper[4845]: I0202 11:50:47.967211 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.054333 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q9vs\" (UniqueName: \"kubernetes.io/projected/75a56863-1da1-4828-8e74-01d392b7c313-kube-api-access-8q9vs\") pod \"75a56863-1da1-4828-8e74-01d392b7c313\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.054531 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-catalog-content\") pod \"75a56863-1da1-4828-8e74-01d392b7c313\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.059546 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-utilities\") pod \"75a56863-1da1-4828-8e74-01d392b7c313\" (UID: \"75a56863-1da1-4828-8e74-01d392b7c313\") " Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.061334 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a56863-1da1-4828-8e74-01d392b7c313-kube-api-access-8q9vs" (OuterVolumeSpecName: "kube-api-access-8q9vs") pod "75a56863-1da1-4828-8e74-01d392b7c313" (UID: "75a56863-1da1-4828-8e74-01d392b7c313"). InnerVolumeSpecName "kube-api-access-8q9vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.068910 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-utilities" (OuterVolumeSpecName: "utilities") pod "75a56863-1da1-4828-8e74-01d392b7c313" (UID: "75a56863-1da1-4828-8e74-01d392b7c313"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.163278 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q9vs\" (UniqueName: \"kubernetes.io/projected/75a56863-1da1-4828-8e74-01d392b7c313-kube-api-access-8q9vs\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.163556 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.176526 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75a56863-1da1-4828-8e74-01d392b7c313" (UID: "75a56863-1da1-4828-8e74-01d392b7c313"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.266123 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75a56863-1da1-4828-8e74-01d392b7c313-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.689177 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hsjdb" event={"ID":"75a56863-1da1-4828-8e74-01d392b7c313","Type":"ContainerDied","Data":"2dc8add61b9b40ac8cd1bce94cbf07534556e8498aa890fa3600d1e8cf918dee"} Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.689248 4845 scope.go:117] "RemoveContainer" containerID="1efc4b956319dcbaebabb1dffd7873abe974a811b1f3b11b91cc34dac1d0755b" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.690243 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hsjdb" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.711103 4845 scope.go:117] "RemoveContainer" containerID="7f1d76ee71825e87e9057fb9875f9f91dc85c07d5e380783111cfb85e63b2628" Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.733999 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hsjdb"] Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.746450 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hsjdb"] Feb 02 11:50:48 crc kubenswrapper[4845]: I0202 11:50:48.758602 4845 scope.go:117] "RemoveContainer" containerID="403a9328fa4fcfabf4647f93c511ea9eb8de34eced9a1a5d2bb052a8e9e9f78d" Feb 02 11:50:49 crc kubenswrapper[4845]: I0202 11:50:49.727222 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a56863-1da1-4828-8e74-01d392b7c313" path="/var/lib/kubelet/pods/75a56863-1da1-4828-8e74-01d392b7c313/volumes" Feb 02 11:51:16 crc kubenswrapper[4845]: I0202 11:51:16.237878 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:51:16 crc kubenswrapper[4845]: I0202 11:51:16.238517 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.707085 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vx6jc/must-gather-x6jwm"] Feb 02 11:51:34 crc kubenswrapper[4845]: E0202 
11:51:34.708441 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="extract-content" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.708462 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="extract-content" Feb 02 11:51:34 crc kubenswrapper[4845]: E0202 11:51:34.708523 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="extract-utilities" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.708545 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="extract-utilities" Feb 02 11:51:34 crc kubenswrapper[4845]: E0202 11:51:34.708560 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="registry-server" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.708574 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="registry-server" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.708918 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a56863-1da1-4828-8e74-01d392b7c313" containerName="registry-server" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.710605 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vx6jc/must-gather-x6jwm" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.714366 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vx6jc"/"default-dockercfg-29z4n" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.714452 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vx6jc"/"openshift-service-ca.crt" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.714535 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vx6jc"/"kube-root-ca.crt" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.721744 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vx6jc/must-gather-x6jwm"] Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.907936 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqd9j\" (UniqueName: \"kubernetes.io/projected/a0d660a9-36eb-4eee-a756-27847f623aa6-kube-api-access-rqd9j\") pod \"must-gather-x6jwm\" (UID: \"a0d660a9-36eb-4eee-a756-27847f623aa6\") " pod="openshift-must-gather-vx6jc/must-gather-x6jwm" Feb 02 11:51:34 crc kubenswrapper[4845]: I0202 11:51:34.908641 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0d660a9-36eb-4eee-a756-27847f623aa6-must-gather-output\") pod \"must-gather-x6jwm\" (UID: \"a0d660a9-36eb-4eee-a756-27847f623aa6\") " pod="openshift-must-gather-vx6jc/must-gather-x6jwm" Feb 02 11:51:35 crc kubenswrapper[4845]: I0202 11:51:35.011154 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqd9j\" (UniqueName: \"kubernetes.io/projected/a0d660a9-36eb-4eee-a756-27847f623aa6-kube-api-access-rqd9j\") pod \"must-gather-x6jwm\" (UID: \"a0d660a9-36eb-4eee-a756-27847f623aa6\") " 
pod="openshift-must-gather-vx6jc/must-gather-x6jwm"
Feb 02 11:51:35 crc kubenswrapper[4845]: I0202 11:51:35.011289 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0d660a9-36eb-4eee-a756-27847f623aa6-must-gather-output\") pod \"must-gather-x6jwm\" (UID: \"a0d660a9-36eb-4eee-a756-27847f623aa6\") " pod="openshift-must-gather-vx6jc/must-gather-x6jwm"
Feb 02 11:51:35 crc kubenswrapper[4845]: I0202 11:51:35.011745 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0d660a9-36eb-4eee-a756-27847f623aa6-must-gather-output\") pod \"must-gather-x6jwm\" (UID: \"a0d660a9-36eb-4eee-a756-27847f623aa6\") " pod="openshift-must-gather-vx6jc/must-gather-x6jwm"
Feb 02 11:51:35 crc kubenswrapper[4845]: I0202 11:51:35.037580 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqd9j\" (UniqueName: \"kubernetes.io/projected/a0d660a9-36eb-4eee-a756-27847f623aa6-kube-api-access-rqd9j\") pod \"must-gather-x6jwm\" (UID: \"a0d660a9-36eb-4eee-a756-27847f623aa6\") " pod="openshift-must-gather-vx6jc/must-gather-x6jwm"
Feb 02 11:51:35 crc kubenswrapper[4845]: I0202 11:51:35.048809 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/must-gather-x6jwm"
Feb 02 11:51:35 crc kubenswrapper[4845]: I0202 11:51:35.609702 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vx6jc/must-gather-x6jwm"]
Feb 02 11:51:36 crc kubenswrapper[4845]: I0202 11:51:36.220610 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/must-gather-x6jwm" event={"ID":"a0d660a9-36eb-4eee-a756-27847f623aa6","Type":"ContainerStarted","Data":"afdebcf3d7e787794c24615db360659e800f3af334354a0e2dfc836ec9e89774"}
Feb 02 11:51:41 crc kubenswrapper[4845]: I0202 11:51:41.286513 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/must-gather-x6jwm" event={"ID":"a0d660a9-36eb-4eee-a756-27847f623aa6","Type":"ContainerStarted","Data":"7aac41ff8df6e02efefc2ff06fc1a8b635ec6f7893e527494a9992604219ab7f"}
Feb 02 11:51:41 crc kubenswrapper[4845]: I0202 11:51:41.287048 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/must-gather-x6jwm" event={"ID":"a0d660a9-36eb-4eee-a756-27847f623aa6","Type":"ContainerStarted","Data":"70dfdbb3165ccda62cfeb30152b80bca1a6f55a7313abd9ca43a514f2c7bab8e"}
Feb 02 11:51:41 crc kubenswrapper[4845]: I0202 11:51:41.311389 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vx6jc/must-gather-x6jwm" podStartSLOduration=2.59189704 podStartE2EDuration="7.31137213s" podCreationTimestamp="2026-02-02 11:51:34 +0000 UTC" firstStartedPulling="2026-02-02 11:51:35.608270496 +0000 UTC m=+4776.699671956" lastFinishedPulling="2026-02-02 11:51:40.327745596 +0000 UTC m=+4781.419147046" observedRunningTime="2026-02-02 11:51:41.307396275 +0000 UTC m=+4782.398797715" watchObservedRunningTime="2026-02-02 11:51:41.31137213 +0000 UTC m=+4782.402773600"
Feb 02 11:51:46 crc kubenswrapper[4845]: I0202 11:51:46.238009 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:51:46 crc kubenswrapper[4845]: I0202 11:51:46.238528 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:51:46 crc kubenswrapper[4845]: I0202 11:51:46.989720 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vx6jc/crc-debug-ntmts"]
Feb 02 11:51:46 crc kubenswrapper[4845]: I0202 11:51:46.991786 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:51:47 crc kubenswrapper[4845]: I0202 11:51:47.059479 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58d791d0-40ca-48a5-a872-d7fb41e99b11-host\") pod \"crc-debug-ntmts\" (UID: \"58d791d0-40ca-48a5-a872-d7fb41e99b11\") " pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:51:47 crc kubenswrapper[4845]: I0202 11:51:47.059763 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxq62\" (UniqueName: \"kubernetes.io/projected/58d791d0-40ca-48a5-a872-d7fb41e99b11-kube-api-access-rxq62\") pod \"crc-debug-ntmts\" (UID: \"58d791d0-40ca-48a5-a872-d7fb41e99b11\") " pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:51:47 crc kubenswrapper[4845]: I0202 11:51:47.161760 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxq62\" (UniqueName: \"kubernetes.io/projected/58d791d0-40ca-48a5-a872-d7fb41e99b11-kube-api-access-rxq62\") pod \"crc-debug-ntmts\" (UID: \"58d791d0-40ca-48a5-a872-d7fb41e99b11\") " pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:51:47 crc kubenswrapper[4845]: I0202 11:51:47.161857 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58d791d0-40ca-48a5-a872-d7fb41e99b11-host\") pod \"crc-debug-ntmts\" (UID: \"58d791d0-40ca-48a5-a872-d7fb41e99b11\") " pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:51:47 crc kubenswrapper[4845]: I0202 11:51:47.162113 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58d791d0-40ca-48a5-a872-d7fb41e99b11-host\") pod \"crc-debug-ntmts\" (UID: \"58d791d0-40ca-48a5-a872-d7fb41e99b11\") " pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:51:47 crc kubenswrapper[4845]: I0202 11:51:47.183580 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxq62\" (UniqueName: \"kubernetes.io/projected/58d791d0-40ca-48a5-a872-d7fb41e99b11-kube-api-access-rxq62\") pod \"crc-debug-ntmts\" (UID: \"58d791d0-40ca-48a5-a872-d7fb41e99b11\") " pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:51:47 crc kubenswrapper[4845]: I0202 11:51:47.314085 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:51:48 crc kubenswrapper[4845]: I0202 11:51:48.375073 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/crc-debug-ntmts" event={"ID":"58d791d0-40ca-48a5-a872-d7fb41e99b11","Type":"ContainerStarted","Data":"346318acde68befdc034a2a7202c4e0ff9451a25850de41e54cbeb1f9e61864a"}
Feb 02 11:52:01 crc kubenswrapper[4845]: I0202 11:52:01.557645 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/crc-debug-ntmts" event={"ID":"58d791d0-40ca-48a5-a872-d7fb41e99b11","Type":"ContainerStarted","Data":"51245b0406ee3556e0c1eaed2454e4cfd8a1b0ad3f05009b2389be64b1574357"}
Feb 02 11:52:01 crc kubenswrapper[4845]: I0202 11:52:01.573350 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vx6jc/crc-debug-ntmts" podStartSLOduration=2.11817017 podStartE2EDuration="15.573329804s" podCreationTimestamp="2026-02-02 11:51:46 +0000 UTC" firstStartedPulling="2026-02-02 11:51:47.363905088 +0000 UTC m=+4788.455306538" lastFinishedPulling="2026-02-02 11:52:00.819064722 +0000 UTC m=+4801.910466172" observedRunningTime="2026-02-02 11:52:01.570197384 +0000 UTC m=+4802.661598834" watchObservedRunningTime="2026-02-02 11:52:01.573329804 +0000 UTC m=+4802.664731254"
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.237709 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.238369 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.238426 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9"
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.239405 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4667d56dc23295df8831dac2a0e953a5f6c9c92d45b3e4effcfbe305f29defc5"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.239460 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://4667d56dc23295df8831dac2a0e953a5f6c9c92d45b3e4effcfbe305f29defc5" gracePeriod=600
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.746984 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="4667d56dc23295df8831dac2a0e953a5f6c9c92d45b3e4effcfbe305f29defc5" exitCode=0
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.747162 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"4667d56dc23295df8831dac2a0e953a5f6c9c92d45b3e4effcfbe305f29defc5"}
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.747541 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"}
Feb 02 11:52:16 crc kubenswrapper[4845]: I0202 11:52:16.747565 4845 scope.go:117] "RemoveContainer" containerID="a48dc4e1fa51321313da4366c947dc0243cb3e2da37c626a8f78a36acef3e12e"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.528003 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4jj26"]
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.531407 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.549551 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4jj26"]
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.618087 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-catalog-content\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.618146 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m9l2\" (UniqueName: \"kubernetes.io/projected/7d906e33-6090-4cdd-ab5a-749672c65f48-kube-api-access-2m9l2\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.618186 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-utilities\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.720037 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-catalog-content\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.720088 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m9l2\" (UniqueName: \"kubernetes.io/projected/7d906e33-6090-4cdd-ab5a-749672c65f48-kube-api-access-2m9l2\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.720127 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-utilities\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.720652 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-utilities\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.720666 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-catalog-content\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.749118 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m9l2\" (UniqueName: \"kubernetes.io/projected/7d906e33-6090-4cdd-ab5a-749672c65f48-kube-api-access-2m9l2\") pod \"certified-operators-4jj26\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:20 crc kubenswrapper[4845]: I0202 11:52:20.906859 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:21 crc kubenswrapper[4845]: I0202 11:52:21.534871 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4jj26"]
Feb 02 11:52:22 crc kubenswrapper[4845]: I0202 11:52:22.839281 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jj26" event={"ID":"7d906e33-6090-4cdd-ab5a-749672c65f48","Type":"ContainerStarted","Data":"80cedd756d836444310b5362b225836cbe7ca0ed2cdf9e25ed8edf3b6702df56"}
Feb 02 11:52:23 crc kubenswrapper[4845]: I0202 11:52:23.855369 4845 generic.go:334] "Generic (PLEG): container finished" podID="58d791d0-40ca-48a5-a872-d7fb41e99b11" containerID="51245b0406ee3556e0c1eaed2454e4cfd8a1b0ad3f05009b2389be64b1574357" exitCode=0
Feb 02 11:52:23 crc kubenswrapper[4845]: I0202 11:52:23.855537 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/crc-debug-ntmts" event={"ID":"58d791d0-40ca-48a5-a872-d7fb41e99b11","Type":"ContainerDied","Data":"51245b0406ee3556e0c1eaed2454e4cfd8a1b0ad3f05009b2389be64b1574357"}
Feb 02 11:52:23 crc kubenswrapper[4845]: I0202 11:52:23.862806 4845 generic.go:334] "Generic (PLEG): container finished" podID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerID="e7521cf920435e8007c22a332efc7e54769be12a28cad279f83023f1bad3f207" exitCode=0
Feb 02 11:52:23 crc kubenswrapper[4845]: I0202 11:52:23.862852 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jj26" event={"ID":"7d906e33-6090-4cdd-ab5a-749672c65f48","Type":"ContainerDied","Data":"e7521cf920435e8007c22a332efc7e54769be12a28cad279f83023f1bad3f207"}
Feb 02 11:52:24 crc kubenswrapper[4845]: I0202 11:52:24.876259 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jj26" event={"ID":"7d906e33-6090-4cdd-ab5a-749672c65f48","Type":"ContainerStarted","Data":"c652225f3a80666b9b4aaca287f4bb0c217edfe4eb70b50541e6bca686568ca1"}
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.034627 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.066431 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vx6jc/crc-debug-ntmts"]
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.080879 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vx6jc/crc-debug-ntmts"]
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.134535 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58d791d0-40ca-48a5-a872-d7fb41e99b11-host\") pod \"58d791d0-40ca-48a5-a872-d7fb41e99b11\" (UID: \"58d791d0-40ca-48a5-a872-d7fb41e99b11\") "
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.134638 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58d791d0-40ca-48a5-a872-d7fb41e99b11-host" (OuterVolumeSpecName: "host") pod "58d791d0-40ca-48a5-a872-d7fb41e99b11" (UID: "58d791d0-40ca-48a5-a872-d7fb41e99b11"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.134699 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxq62\" (UniqueName: \"kubernetes.io/projected/58d791d0-40ca-48a5-a872-d7fb41e99b11-kube-api-access-rxq62\") pod \"58d791d0-40ca-48a5-a872-d7fb41e99b11\" (UID: \"58d791d0-40ca-48a5-a872-d7fb41e99b11\") "
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.135711 4845 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/58d791d0-40ca-48a5-a872-d7fb41e99b11-host\") on node \"crc\" DevicePath \"\""
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.144842 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58d791d0-40ca-48a5-a872-d7fb41e99b11-kube-api-access-rxq62" (OuterVolumeSpecName: "kube-api-access-rxq62") pod "58d791d0-40ca-48a5-a872-d7fb41e99b11" (UID: "58d791d0-40ca-48a5-a872-d7fb41e99b11"). InnerVolumeSpecName "kube-api-access-rxq62". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.237904 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxq62\" (UniqueName: \"kubernetes.io/projected/58d791d0-40ca-48a5-a872-d7fb41e99b11-kube-api-access-rxq62\") on node \"crc\" DevicePath \"\""
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.727608 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58d791d0-40ca-48a5-a872-d7fb41e99b11" path="/var/lib/kubelet/pods/58d791d0-40ca-48a5-a872-d7fb41e99b11/volumes"
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.887528 4845 generic.go:334] "Generic (PLEG): container finished" podID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerID="c652225f3a80666b9b4aaca287f4bb0c217edfe4eb70b50541e6bca686568ca1" exitCode=0
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.887614 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jj26" event={"ID":"7d906e33-6090-4cdd-ab5a-749672c65f48","Type":"ContainerDied","Data":"c652225f3a80666b9b4aaca287f4bb0c217edfe4eb70b50541e6bca686568ca1"}
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.888777 4845 scope.go:117] "RemoveContainer" containerID="51245b0406ee3556e0c1eaed2454e4cfd8a1b0ad3f05009b2389be64b1574357"
Feb 02 11:52:25 crc kubenswrapper[4845]: I0202 11:52:25.888993 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/crc-debug-ntmts"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.359525 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vx6jc/crc-debug-t28c5"]
Feb 02 11:52:26 crc kubenswrapper[4845]: E0202 11:52:26.360206 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58d791d0-40ca-48a5-a872-d7fb41e99b11" containerName="container-00"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.360236 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="58d791d0-40ca-48a5-a872-d7fb41e99b11" containerName="container-00"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.360455 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="58d791d0-40ca-48a5-a872-d7fb41e99b11" containerName="container-00"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.361256 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.464973 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr2nt\" (UniqueName: \"kubernetes.io/projected/c729ce2c-d59b-4e91-84bb-21d018f9f204-kube-api-access-kr2nt\") pod \"crc-debug-t28c5\" (UID: \"c729ce2c-d59b-4e91-84bb-21d018f9f204\") " pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.465157 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c729ce2c-d59b-4e91-84bb-21d018f9f204-host\") pod \"crc-debug-t28c5\" (UID: \"c729ce2c-d59b-4e91-84bb-21d018f9f204\") " pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.567543 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c729ce2c-d59b-4e91-84bb-21d018f9f204-host\") pod \"crc-debug-t28c5\" (UID: \"c729ce2c-d59b-4e91-84bb-21d018f9f204\") " pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.567668 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c729ce2c-d59b-4e91-84bb-21d018f9f204-host\") pod \"crc-debug-t28c5\" (UID: \"c729ce2c-d59b-4e91-84bb-21d018f9f204\") " pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.568294 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr2nt\" (UniqueName: \"kubernetes.io/projected/c729ce2c-d59b-4e91-84bb-21d018f9f204-kube-api-access-kr2nt\") pod \"crc-debug-t28c5\" (UID: \"c729ce2c-d59b-4e91-84bb-21d018f9f204\") " pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.591768 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr2nt\" (UniqueName: \"kubernetes.io/projected/c729ce2c-d59b-4e91-84bb-21d018f9f204-kube-api-access-kr2nt\") pod \"crc-debug-t28c5\" (UID: \"c729ce2c-d59b-4e91-84bb-21d018f9f204\") " pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.753244 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.911549 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/crc-debug-t28c5" event={"ID":"c729ce2c-d59b-4e91-84bb-21d018f9f204","Type":"ContainerStarted","Data":"d3a558afe7937dea4c6a2e2ad0ab37289f3b2cedefe41c3b6e12d13b7bd78120"}
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.914066 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jj26" event={"ID":"7d906e33-6090-4cdd-ab5a-749672c65f48","Type":"ContainerStarted","Data":"ce8812117c5ad002cc231a9a155cf788764ec6017b333803334ede26175d300a"}
Feb 02 11:52:26 crc kubenswrapper[4845]: I0202 11:52:26.974389 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4jj26" podStartSLOduration=4.5353691000000005 podStartE2EDuration="6.974361578s" podCreationTimestamp="2026-02-02 11:52:20 +0000 UTC" firstStartedPulling="2026-02-02 11:52:23.865879366 +0000 UTC m=+4824.957280816" lastFinishedPulling="2026-02-02 11:52:26.304871844 +0000 UTC m=+4827.396273294" observedRunningTime="2026-02-02 11:52:26.946976791 +0000 UTC m=+4828.038378261" watchObservedRunningTime="2026-02-02 11:52:26.974361578 +0000 UTC m=+4828.065763048"
Feb 02 11:52:27 crc kubenswrapper[4845]: I0202 11:52:27.926400 4845 generic.go:334] "Generic (PLEG): container finished" podID="c729ce2c-d59b-4e91-84bb-21d018f9f204" containerID="9152f3fa80117e002d37c2d1ef281af4c758c28ae16bc74897b372ff63e0c244" exitCode=1
Feb 02 11:52:27 crc kubenswrapper[4845]: I0202 11:52:27.926469 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/crc-debug-t28c5" event={"ID":"c729ce2c-d59b-4e91-84bb-21d018f9f204","Type":"ContainerDied","Data":"9152f3fa80117e002d37c2d1ef281af4c758c28ae16bc74897b372ff63e0c244"}
Feb 02 11:52:27 crc kubenswrapper[4845]: I0202 11:52:27.983375 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vx6jc/crc-debug-t28c5"]
Feb 02 11:52:27 crc kubenswrapper[4845]: I0202 11:52:27.997565 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vx6jc/crc-debug-t28c5"]
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.068497 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.229420 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr2nt\" (UniqueName: \"kubernetes.io/projected/c729ce2c-d59b-4e91-84bb-21d018f9f204-kube-api-access-kr2nt\") pod \"c729ce2c-d59b-4e91-84bb-21d018f9f204\" (UID: \"c729ce2c-d59b-4e91-84bb-21d018f9f204\") "
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.229522 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c729ce2c-d59b-4e91-84bb-21d018f9f204-host\") pod \"c729ce2c-d59b-4e91-84bb-21d018f9f204\" (UID: \"c729ce2c-d59b-4e91-84bb-21d018f9f204\") "
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.230082 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c729ce2c-d59b-4e91-84bb-21d018f9f204-host" (OuterVolumeSpecName: "host") pod "c729ce2c-d59b-4e91-84bb-21d018f9f204" (UID: "c729ce2c-d59b-4e91-84bb-21d018f9f204"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.230393 4845 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c729ce2c-d59b-4e91-84bb-21d018f9f204-host\") on node \"crc\" DevicePath \"\""
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.740973 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c729ce2c-d59b-4e91-84bb-21d018f9f204-kube-api-access-kr2nt" (OuterVolumeSpecName: "kube-api-access-kr2nt") pod "c729ce2c-d59b-4e91-84bb-21d018f9f204" (UID: "c729ce2c-d59b-4e91-84bb-21d018f9f204"). InnerVolumeSpecName "kube-api-access-kr2nt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.844828 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr2nt\" (UniqueName: \"kubernetes.io/projected/c729ce2c-d59b-4e91-84bb-21d018f9f204-kube-api-access-kr2nt\") on node \"crc\" DevicePath \"\""
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.946388 4845 scope.go:117] "RemoveContainer" containerID="9152f3fa80117e002d37c2d1ef281af4c758c28ae16bc74897b372ff63e0c244"
Feb 02 11:52:29 crc kubenswrapper[4845]: I0202 11:52:29.946434 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/crc-debug-t28c5"
Feb 02 11:52:30 crc kubenswrapper[4845]: I0202 11:52:30.907214 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:30 crc kubenswrapper[4845]: I0202 11:52:30.907298 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:30 crc kubenswrapper[4845]: I0202 11:52:30.969640 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:31 crc kubenswrapper[4845]: I0202 11:52:31.729442 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c729ce2c-d59b-4e91-84bb-21d018f9f204" path="/var/lib/kubelet/pods/c729ce2c-d59b-4e91-84bb-21d018f9f204/volumes"
Feb 02 11:52:34 crc kubenswrapper[4845]: I0202 11:52:34.920638 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-clx9d"]
Feb 02 11:52:34 crc kubenswrapper[4845]: E0202 11:52:34.921775 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c729ce2c-d59b-4e91-84bb-21d018f9f204" containerName="container-00"
Feb 02 11:52:34 crc kubenswrapper[4845]: I0202 11:52:34.921796 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c729ce2c-d59b-4e91-84bb-21d018f9f204" containerName="container-00"
Feb 02 11:52:34 crc kubenswrapper[4845]: I0202 11:52:34.922070 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c729ce2c-d59b-4e91-84bb-21d018f9f204" containerName="container-00"
Feb 02 11:52:34 crc kubenswrapper[4845]: I0202 11:52:34.923714 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:34 crc kubenswrapper[4845]: I0202 11:52:34.950629 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clx9d"]
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.076151 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfg77\" (UniqueName: \"kubernetes.io/projected/359cd7da-7e78-430b-90df-0924f27608c2-kube-api-access-gfg77\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.076208 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-utilities\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.076325 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-catalog-content\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.179063 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfg77\" (UniqueName: \"kubernetes.io/projected/359cd7da-7e78-430b-90df-0924f27608c2-kube-api-access-gfg77\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.179127 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-utilities\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.179235 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-catalog-content\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.179755 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-utilities\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.179770 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-catalog-content\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.200820 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfg77\" (UniqueName: \"kubernetes.io/projected/359cd7da-7e78-430b-90df-0924f27608c2-kube-api-access-gfg77\") pod \"community-operators-clx9d\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.265477 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clx9d"
Feb 02 11:52:35 crc kubenswrapper[4845]: I0202 11:52:35.892573 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clx9d"]
Feb 02 11:52:35 crc kubenswrapper[4845]: W0202 11:52:35.896383 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod359cd7da_7e78_430b_90df_0924f27608c2.slice/crio-15d4c362a1d3553b04d288f58f98c0d8865490035e9beea360d9b69141ab0270 WatchSource:0}: Error finding container 15d4c362a1d3553b04d288f58f98c0d8865490035e9beea360d9b69141ab0270: Status 404 returned error can't find the container with id 15d4c362a1d3553b04d288f58f98c0d8865490035e9beea360d9b69141ab0270
Feb 02 11:52:36 crc kubenswrapper[4845]: I0202 11:52:36.014532 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clx9d" event={"ID":"359cd7da-7e78-430b-90df-0924f27608c2","Type":"ContainerStarted","Data":"15d4c362a1d3553b04d288f58f98c0d8865490035e9beea360d9b69141ab0270"}
Feb 02 11:52:37 crc kubenswrapper[4845]: I0202 11:52:37.026318 4845 generic.go:334] "Generic (PLEG): container finished" podID="359cd7da-7e78-430b-90df-0924f27608c2" containerID="91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91" exitCode=0
Feb 02 11:52:37 crc kubenswrapper[4845]: I0202 11:52:37.026512 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clx9d" event={"ID":"359cd7da-7e78-430b-90df-0924f27608c2","Type":"ContainerDied","Data":"91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91"}
Feb 02 11:52:38 crc kubenswrapper[4845]: I0202 11:52:38.038187 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clx9d" event={"ID":"359cd7da-7e78-430b-90df-0924f27608c2","Type":"ContainerStarted","Data":"3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef"}
Feb 02 11:52:39 crc kubenswrapper[4845]: I0202 11:52:39.052108 4845 generic.go:334] "Generic (PLEG): container finished" podID="359cd7da-7e78-430b-90df-0924f27608c2" containerID="3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef" exitCode=0
Feb 02 11:52:39 crc kubenswrapper[4845]: I0202 11:52:39.052301 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clx9d" event={"ID":"359cd7da-7e78-430b-90df-0924f27608c2","Type":"ContainerDied","Data":"3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef"}
Feb 02 11:52:40 crc kubenswrapper[4845]: I0202 11:52:40.073101 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clx9d" event={"ID":"359cd7da-7e78-430b-90df-0924f27608c2","Type":"ContainerStarted","Data":"d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb"}
Feb 02 11:52:40 crc kubenswrapper[4845]: I0202 11:52:40.100898 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-clx9d" podStartSLOduration=3.6587866780000002 podStartE2EDuration="6.100871655s" podCreationTimestamp="2026-02-02 11:52:34 +0000 UTC" firstStartedPulling="2026-02-02 11:52:37.029452208 +0000 UTC m=+4838.120853658" lastFinishedPulling="2026-02-02 11:52:39.471537185 +0000 UTC m=+4840.562938635" observedRunningTime="2026-02-02 11:52:40.089843198 +0000 UTC m=+4841.181244648" watchObservedRunningTime="2026-02-02 11:52:40.100871655 +0000 UTC m=+4841.192273125"
Feb 02 11:52:41 crc kubenswrapper[4845]: I0202 11:52:41.381120 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4jj26"
Feb 02 11:52:42 crc kubenswrapper[4845]: I0202 11:52:42.296317 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4jj26"]
Feb 02 11:52:42 crc kubenswrapper[4845]: I0202 11:52:42.296861 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4jj26" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerName="registry-server" containerID="cri-o://ce8812117c5ad002cc231a9a155cf788764ec6017b333803334ede26175d300a" gracePeriod=2
Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.104683 4845 generic.go:334] "Generic (PLEG): container finished" podID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerID="ce8812117c5ad002cc231a9a155cf788764ec6017b333803334ede26175d300a" exitCode=0
Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.104879 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jj26" event={"ID":"7d906e33-6090-4cdd-ab5a-749672c65f48","Type":"ContainerDied","Data":"ce8812117c5ad002cc231a9a155cf788764ec6017b333803334ede26175d300a"}
Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.578210 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4jj26" Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.711427 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-catalog-content\") pod \"7d906e33-6090-4cdd-ab5a-749672c65f48\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.711645 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-utilities\") pod \"7d906e33-6090-4cdd-ab5a-749672c65f48\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.711752 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m9l2\" (UniqueName: \"kubernetes.io/projected/7d906e33-6090-4cdd-ab5a-749672c65f48-kube-api-access-2m9l2\") pod \"7d906e33-6090-4cdd-ab5a-749672c65f48\" (UID: \"7d906e33-6090-4cdd-ab5a-749672c65f48\") " Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.714180 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-utilities" (OuterVolumeSpecName: "utilities") pod "7d906e33-6090-4cdd-ab5a-749672c65f48" (UID: "7d906e33-6090-4cdd-ab5a-749672c65f48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.720635 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d906e33-6090-4cdd-ab5a-749672c65f48-kube-api-access-2m9l2" (OuterVolumeSpecName: "kube-api-access-2m9l2") pod "7d906e33-6090-4cdd-ab5a-749672c65f48" (UID: "7d906e33-6090-4cdd-ab5a-749672c65f48"). InnerVolumeSpecName "kube-api-access-2m9l2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.774053 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d906e33-6090-4cdd-ab5a-749672c65f48" (UID: "7d906e33-6090-4cdd-ab5a-749672c65f48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.817521 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.817783 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m9l2\" (UniqueName: \"kubernetes.io/projected/7d906e33-6090-4cdd-ab5a-749672c65f48-kube-api-access-2m9l2\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:43 crc kubenswrapper[4845]: I0202 11:52:43.817798 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d906e33-6090-4cdd-ab5a-749672c65f48-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:44 crc kubenswrapper[4845]: I0202 11:52:44.117869 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4jj26" event={"ID":"7d906e33-6090-4cdd-ab5a-749672c65f48","Type":"ContainerDied","Data":"80cedd756d836444310b5362b225836cbe7ca0ed2cdf9e25ed8edf3b6702df56"} Feb 02 11:52:44 crc kubenswrapper[4845]: I0202 11:52:44.117946 4845 scope.go:117] "RemoveContainer" containerID="ce8812117c5ad002cc231a9a155cf788764ec6017b333803334ede26175d300a" Feb 02 11:52:44 crc kubenswrapper[4845]: I0202 11:52:44.118131 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4jj26" Feb 02 11:52:44 crc kubenswrapper[4845]: I0202 11:52:44.145983 4845 scope.go:117] "RemoveContainer" containerID="c652225f3a80666b9b4aaca287f4bb0c217edfe4eb70b50541e6bca686568ca1" Feb 02 11:52:44 crc kubenswrapper[4845]: I0202 11:52:44.169861 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4jj26"] Feb 02 11:52:44 crc kubenswrapper[4845]: I0202 11:52:44.181537 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4jj26"] Feb 02 11:52:44 crc kubenswrapper[4845]: I0202 11:52:44.183629 4845 scope.go:117] "RemoveContainer" containerID="e7521cf920435e8007c22a332efc7e54769be12a28cad279f83023f1bad3f207" Feb 02 11:52:45 crc kubenswrapper[4845]: I0202 11:52:45.266536 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-clx9d" Feb 02 11:52:45 crc kubenswrapper[4845]: I0202 11:52:45.266580 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-clx9d" Feb 02 11:52:45 crc kubenswrapper[4845]: I0202 11:52:45.315176 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-clx9d" Feb 02 11:52:45 crc kubenswrapper[4845]: I0202 11:52:45.725552 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" path="/var/lib/kubelet/pods/7d906e33-6090-4cdd-ab5a-749672c65f48/volumes" Feb 02 11:52:46 crc kubenswrapper[4845]: I0202 11:52:46.190245 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-clx9d" Feb 02 11:52:46 crc kubenswrapper[4845]: I0202 11:52:46.701597 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clx9d"] Feb 02 11:52:48 crc 
kubenswrapper[4845]: I0202 11:52:48.217252 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-clx9d" podUID="359cd7da-7e78-430b-90df-0924f27608c2" containerName="registry-server" containerID="cri-o://d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb" gracePeriod=2 Feb 02 11:52:48 crc kubenswrapper[4845]: E0202 11:52:48.340547 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod359cd7da_7e78_430b_90df_0924f27608c2.slice/crio-conmon-d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod359cd7da_7e78_430b_90df_0924f27608c2.slice/crio-d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:52:48 crc kubenswrapper[4845]: I0202 11:52:48.922595 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clx9d" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.073498 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfg77\" (UniqueName: \"kubernetes.io/projected/359cd7da-7e78-430b-90df-0924f27608c2-kube-api-access-gfg77\") pod \"359cd7da-7e78-430b-90df-0924f27608c2\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.073644 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-catalog-content\") pod \"359cd7da-7e78-430b-90df-0924f27608c2\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.073708 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-utilities\") pod \"359cd7da-7e78-430b-90df-0924f27608c2\" (UID: \"359cd7da-7e78-430b-90df-0924f27608c2\") " Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.075455 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-utilities" (OuterVolumeSpecName: "utilities") pod "359cd7da-7e78-430b-90df-0924f27608c2" (UID: "359cd7da-7e78-430b-90df-0924f27608c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.081205 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359cd7da-7e78-430b-90df-0924f27608c2-kube-api-access-gfg77" (OuterVolumeSpecName: "kube-api-access-gfg77") pod "359cd7da-7e78-430b-90df-0924f27608c2" (UID: "359cd7da-7e78-430b-90df-0924f27608c2"). InnerVolumeSpecName "kube-api-access-gfg77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.161784 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "359cd7da-7e78-430b-90df-0924f27608c2" (UID: "359cd7da-7e78-430b-90df-0924f27608c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.177115 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfg77\" (UniqueName: \"kubernetes.io/projected/359cd7da-7e78-430b-90df-0924f27608c2-kube-api-access-gfg77\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.177166 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.177186 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/359cd7da-7e78-430b-90df-0924f27608c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.239282 4845 generic.go:334] "Generic (PLEG): container finished" podID="359cd7da-7e78-430b-90df-0924f27608c2" containerID="d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb" exitCode=0 Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.239335 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clx9d" event={"ID":"359cd7da-7e78-430b-90df-0924f27608c2","Type":"ContainerDied","Data":"d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb"} Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.239363 4845 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-clx9d" event={"ID":"359cd7da-7e78-430b-90df-0924f27608c2","Type":"ContainerDied","Data":"15d4c362a1d3553b04d288f58f98c0d8865490035e9beea360d9b69141ab0270"} Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.239380 4845 scope.go:117] "RemoveContainer" containerID="d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.239558 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clx9d" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.276007 4845 scope.go:117] "RemoveContainer" containerID="3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.285289 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clx9d"] Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.298107 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-clx9d"] Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.299182 4845 scope.go:117] "RemoveContainer" containerID="91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.352064 4845 scope.go:117] "RemoveContainer" containerID="d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb" Feb 02 11:52:49 crc kubenswrapper[4845]: E0202 11:52:49.352514 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb\": container with ID starting with d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb not found: ID does not exist" containerID="d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 
11:52:49.352555 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb"} err="failed to get container status \"d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb\": rpc error: code = NotFound desc = could not find container \"d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb\": container with ID starting with d7f120c8c90e9d47f3e57ed9e4798938e3d985f7e4c6b455db95590c9b0069fb not found: ID does not exist" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.352584 4845 scope.go:117] "RemoveContainer" containerID="3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef" Feb 02 11:52:49 crc kubenswrapper[4845]: E0202 11:52:49.353024 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef\": container with ID starting with 3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef not found: ID does not exist" containerID="3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.353052 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef"} err="failed to get container status \"3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef\": rpc error: code = NotFound desc = could not find container \"3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef\": container with ID starting with 3dddecb4998bcdcbc6205bad5e6fa6f2c13fb363dfb2c7bf32befaa7521e96ef not found: ID does not exist" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.353068 4845 scope.go:117] "RemoveContainer" containerID="91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91" Feb 02 11:52:49 crc 
kubenswrapper[4845]: E0202 11:52:49.353343 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91\": container with ID starting with 91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91 not found: ID does not exist" containerID="91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.353370 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91"} err="failed to get container status \"91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91\": rpc error: code = NotFound desc = could not find container \"91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91\": container with ID starting with 91cfa1bca0e33f7dbb42899c32bcadfac501f3ebe292d4c890d2cd403f521e91 not found: ID does not exist" Feb 02 11:52:49 crc kubenswrapper[4845]: I0202 11:52:49.726461 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359cd7da-7e78-430b-90df-0924f27608c2" path="/var/lib/kubelet/pods/359cd7da-7e78-430b-90df-0924f27608c2/volumes" Feb 02 11:53:35 crc kubenswrapper[4845]: I0202 11:53:35.634279 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-8448c87f86-gdg49_4d926fea-dae3-4818-a608-4d9fa52abef5/barbican-api/0.log" Feb 02 11:53:35 crc kubenswrapper[4845]: I0202 11:53:35.967866 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-8448c87f86-gdg49_4d926fea-dae3-4818-a608-4d9fa52abef5/barbican-api-log/0.log" Feb 02 11:53:35 crc kubenswrapper[4845]: I0202 11:53:35.967958 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-555888887b-mbz72_6dea749b-261a-4af3-979a-127dca4af07c/barbican-keystone-listener-log/0.log" Feb 
02 11:53:35 crc kubenswrapper[4845]: I0202 11:53:35.978854 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-555888887b-mbz72_6dea749b-261a-4af3-979a-127dca4af07c/barbican-keystone-listener/0.log" Feb 02 11:53:36 crc kubenswrapper[4845]: I0202 11:53:36.198804 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-954bfc4f9-dfghw_a9ec709e-f840-4ba0-b631-77038f9c5551/barbican-worker/0.log" Feb 02 11:53:36 crc kubenswrapper[4845]: I0202 11:53:36.213963 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-954bfc4f9-dfghw_a9ec709e-f840-4ba0-b631-77038f9c5551/barbican-worker-log/0.log" Feb 02 11:53:36 crc kubenswrapper[4845]: I0202 11:53:36.424853 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_813ec32b-5cd3-491d-85ac-bcf0140d0a8f/ceilometer-central-agent/0.log" Feb 02 11:53:36 crc kubenswrapper[4845]: I0202 11:53:36.455178 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_813ec32b-5cd3-491d-85ac-bcf0140d0a8f/proxy-httpd/0.log" Feb 02 11:53:36 crc kubenswrapper[4845]: I0202 11:53:36.490859 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_813ec32b-5cd3-491d-85ac-bcf0140d0a8f/sg-core/0.log" Feb 02 11:53:36 crc kubenswrapper[4845]: I0202 11:53:36.495370 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_813ec32b-5cd3-491d-85ac-bcf0140d0a8f/ceilometer-notification-agent/0.log" Feb 02 11:53:36 crc kubenswrapper[4845]: I0202 11:53:36.757499 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1800fe94-c9b9-4a5a-963a-75d82a4eab94/cinder-api/0.log" Feb 02 11:53:36 crc kubenswrapper[4845]: I0202 11:53:36.794078 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_1800fe94-c9b9-4a5a-963a-75d82a4eab94/cinder-api-log/0.log" Feb 02 11:53:37 crc 
kubenswrapper[4845]: I0202 11:53:37.000575 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3483568c-cbaa-4f63-94e5-36d1a9534d31/cinder-scheduler/0.log" Feb 02 11:53:37 crc kubenswrapper[4845]: I0202 11:53:37.081306 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3483568c-cbaa-4f63-94e5-36d1a9534d31/probe/0.log" Feb 02 11:53:37 crc kubenswrapper[4845]: I0202 11:53:37.109961 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-785dh_330a4322-2c1c-4f9a-9093-bfae422cc1fb/init/0.log" Feb 02 11:53:37 crc kubenswrapper[4845]: I0202 11:53:37.322348 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-785dh_330a4322-2c1c-4f9a-9093-bfae422cc1fb/dnsmasq-dns/0.log" Feb 02 11:53:37 crc kubenswrapper[4845]: I0202 11:53:37.475417 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b7bbf7cf9-785dh_330a4322-2c1c-4f9a-9093-bfae422cc1fb/init/0.log" Feb 02 11:53:37 crc kubenswrapper[4845]: I0202 11:53:37.630680 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_80eee60d-7cee-4b29-b022-9f5e8e5d6bdb/glance-httpd/0.log" Feb 02 11:53:37 crc kubenswrapper[4845]: I0202 11:53:37.788970 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_80eee60d-7cee-4b29-b022-9f5e8e5d6bdb/glance-log/0.log" Feb 02 11:53:37 crc kubenswrapper[4845]: I0202 11:53:37.868682 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fbdeff72-81f9-4063-8704-d97b21e01b82/glance-httpd/0.log" Feb 02 11:53:37 crc kubenswrapper[4845]: I0202 11:53:37.924575 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_fbdeff72-81f9-4063-8704-d97b21e01b82/glance-log/0.log" Feb 02 11:53:38 crc kubenswrapper[4845]: I0202 
11:53:38.606251 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-658dbb4bcd-qn5fs_6cfd78fb-8f69-43d4-9a58-f7e2f5d27958/heat-engine/0.log" Feb 02 11:53:38 crc kubenswrapper[4845]: I0202 11:53:38.663856 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-7998b4fc87-n5g2f_7dfab927-78ef-4105-a07b-a109690fda89/heat-api/0.log" Feb 02 11:53:38 crc kubenswrapper[4845]: I0202 11:53:38.704163 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6dcccd9c6c-tq64l_1225250d-8a00-47d3-acea-856fa864dff5/heat-cfnapi/0.log" Feb 02 11:53:38 crc kubenswrapper[4845]: I0202 11:53:38.916295 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29500501-znk7q_7303667b-89bb-4ad1-92a8-3c94525911d4/keystone-cron/0.log" Feb 02 11:53:38 crc kubenswrapper[4845]: I0202 11:53:38.963739 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c4f9db54b-5v9r8_61e42051-311d-4b4b-af17-e301351d9267/keystone-api/0.log" Feb 02 11:53:39 crc kubenswrapper[4845]: I0202 11:53:39.174384 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f70412f5-a824-45b2-92c2-8e37a25d540a/kube-state-metrics/0.log" Feb 02 11:53:39 crc kubenswrapper[4845]: I0202 11:53:39.484299 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_6704fdd3-f589-4ccd-9a52-4a914e219b09/mysqld-exporter/0.log" Feb 02 11:53:39 crc kubenswrapper[4845]: I0202 11:53:39.713032 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b8db7b6ff-lx6zl_7b4befc3-7f3f-4813-9c5e-9fac28d60f72/neutron-api/0.log" Feb 02 11:53:39 crc kubenswrapper[4845]: I0202 11:53:39.818095 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b8db7b6ff-lx6zl_7b4befc3-7f3f-4813-9c5e-9fac28d60f72/neutron-httpd/0.log" Feb 02 11:53:40 crc kubenswrapper[4845]: I0202 11:53:40.381502 4845 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_953beda6-58f2-45c2-b34e-0cb7db2d3bf6/nova-api-log/0.log" Feb 02 11:53:40 crc kubenswrapper[4845]: I0202 11:53:40.733435 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_af5e3b4b-9a44-4b50-8799-71f869de9028/nova-cell0-conductor-conductor/0.log" Feb 02 11:53:40 crc kubenswrapper[4845]: I0202 11:53:40.740045 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_953beda6-58f2-45c2-b34e-0cb7db2d3bf6/nova-api-api/0.log" Feb 02 11:53:40 crc kubenswrapper[4845]: I0202 11:53:40.840914 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_039d1d72-0f72-4172-a037-ea289c8d7fbb/nova-cell1-conductor-conductor/0.log" Feb 02 11:53:41 crc kubenswrapper[4845]: I0202 11:53:41.311552 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_85bf6fdc-0816-4f80-966c-426f4906c581/nova-cell1-novncproxy-novncproxy/0.log" Feb 02 11:53:42 crc kubenswrapper[4845]: I0202 11:53:42.037057 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_12adbd4d-efe1-4549-bcac-f2b5f14f18b9/nova-metadata-log/0.log" Feb 02 11:53:42 crc kubenswrapper[4845]: I0202 11:53:42.074952 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b3eed39b-ccd7-4c3d-bbd8-6872503e1c60/nova-scheduler-scheduler/0.log" Feb 02 11:53:42 crc kubenswrapper[4845]: I0202 11:53:42.274509 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_25ccf740-cc48-4863-8a7d-98548588860f/mysql-bootstrap/0.log" Feb 02 11:53:42 crc kubenswrapper[4845]: I0202 11:53:42.580018 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_25ccf740-cc48-4863-8a7d-98548588860f/mysql-bootstrap/0.log" Feb 02 11:53:42 crc kubenswrapper[4845]: I0202 11:53:42.613974 4845 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_25ccf740-cc48-4863-8a7d-98548588860f/galera/0.log" Feb 02 11:53:42 crc kubenswrapper[4845]: I0202 11:53:42.861098 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c7d4707-dfce-464f-bffe-0d543bea6299/mysql-bootstrap/0.log" Feb 02 11:53:43 crc kubenswrapper[4845]: I0202 11:53:43.042544 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c7d4707-dfce-464f-bffe-0d543bea6299/mysql-bootstrap/0.log" Feb 02 11:53:43 crc kubenswrapper[4845]: I0202 11:53:43.096784 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0c7d4707-dfce-464f-bffe-0d543bea6299/galera/0.log" Feb 02 11:53:43 crc kubenswrapper[4845]: I0202 11:53:43.725163 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_12adbd4d-efe1-4549-bcac-f2b5f14f18b9/nova-metadata-metadata/0.log" Feb 02 11:53:43 crc kubenswrapper[4845]: I0202 11:53:43.861657 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_c10a41f9-4bda-4d90-81c1-09ed21f00b2b/openstackclient/0.log" Feb 02 11:53:43 crc kubenswrapper[4845]: I0202 11:53:43.943715 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mqgrd_0f152c8c-6cc4-4586-9fcb-c1ddee6e81d2/openstack-network-exporter/0.log" Feb 02 11:53:44 crc kubenswrapper[4845]: I0202 11:53:44.165166 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9qwr2_4f430e6a-b6ca-42b5-bb37-e5104bba0bd1/ovsdb-server-init/0.log" Feb 02 11:53:44 crc kubenswrapper[4845]: I0202 11:53:44.416183 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9qwr2_4f430e6a-b6ca-42b5-bb37-e5104bba0bd1/ovsdb-server-init/0.log" Feb 02 11:53:44 crc kubenswrapper[4845]: I0202 11:53:44.417863 4845 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-controller-ovs-9qwr2_4f430e6a-b6ca-42b5-bb37-e5104bba0bd1/ovs-vswitchd/0.log" Feb 02 11:53:44 crc kubenswrapper[4845]: I0202 11:53:44.457098 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9qwr2_4f430e6a-b6ca-42b5-bb37-e5104bba0bd1/ovsdb-server/0.log" Feb 02 11:53:44 crc kubenswrapper[4845]: I0202 11:53:44.633051 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-tt4db_72da7703-b176-47cb-953e-de037d663c55/ovn-controller/0.log" Feb 02 11:53:44 crc kubenswrapper[4845]: I0202 11:53:44.745821 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_53989098-3602-4958-96b3-ca7c539c29c9/openstack-network-exporter/0.log" Feb 02 11:53:44 crc kubenswrapper[4845]: I0202 11:53:44.752501 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_53989098-3602-4958-96b3-ca7c539c29c9/ovn-northd/0.log" Feb 02 11:53:44 crc kubenswrapper[4845]: I0202 11:53:44.950530 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd4a7449-0e37-44e1-9f01-bb1a336cb8cd/openstack-network-exporter/0.log" Feb 02 11:53:45 crc kubenswrapper[4845]: I0202 11:53:45.243076 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bd4a7449-0e37-44e1-9f01-bb1a336cb8cd/ovsdbserver-nb/0.log" Feb 02 11:53:45 crc kubenswrapper[4845]: I0202 11:53:45.431828 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_67a51964-326b-42cd-8055-0822d42557f7/openstack-network-exporter/0.log" Feb 02 11:53:45 crc kubenswrapper[4845]: I0202 11:53:45.455069 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_67a51964-326b-42cd-8055-0822d42557f7/ovsdbserver-sb/0.log" Feb 02 11:53:45 crc kubenswrapper[4845]: I0202 11:53:45.649892 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-68f64c64d8-r7nkx_5978920a-e63d-4cb3-accd-4353fb398d50/placement-api/0.log" Feb 02 11:53:45 crc kubenswrapper[4845]: I0202 11:53:45.880599 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-68f64c64d8-r7nkx_5978920a-e63d-4cb3-accd-4353fb398d50/placement-log/0.log" Feb 02 11:53:45 crc kubenswrapper[4845]: I0202 11:53:45.937206 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_31859db3-3de0-46d0-a81b-b951f1d45279/init-config-reloader/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.062495 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_31859db3-3de0-46d0-a81b-b951f1d45279/config-reloader/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.151504 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_31859db3-3de0-46d0-a81b-b951f1d45279/init-config-reloader/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.164716 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_31859db3-3de0-46d0-a81b-b951f1d45279/thanos-sidecar/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.191956 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_31859db3-3de0-46d0-a81b-b951f1d45279/prometheus/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.415754 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_70739f91-4fde-4bc2-b4e1-5bdb7cb0426c/setup-container/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.642026 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_70739f91-4fde-4bc2-b4e1-5bdb7cb0426c/rabbitmq/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.708921 4845 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-server-0_2e45ad6a-20f4-4da2-82b7-500ed29a0cd5/setup-container/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.753031 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_70739f91-4fde-4bc2-b4e1-5bdb7cb0426c/setup-container/0.log" Feb 02 11:53:46 crc kubenswrapper[4845]: I0202 11:53:46.987742 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2e45ad6a-20f4-4da2-82b7-500ed29a0cd5/setup-container/0.log" Feb 02 11:53:47 crc kubenswrapper[4845]: I0202 11:53:47.070589 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2e45ad6a-20f4-4da2-82b7-500ed29a0cd5/rabbitmq/0.log" Feb 02 11:53:47 crc kubenswrapper[4845]: I0202 11:53:47.105491 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_d0a3a285-364a-4df2-8a7c-947ff673f254/setup-container/0.log" Feb 02 11:53:47 crc kubenswrapper[4845]: I0202 11:53:47.399543 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_d0a3a285-364a-4df2-8a7c-947ff673f254/setup-container/0.log" Feb 02 11:53:47 crc kubenswrapper[4845]: I0202 11:53:47.443888 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_d0a3a285-364a-4df2-8a7c-947ff673f254/rabbitmq/0.log" Feb 02 11:53:47 crc kubenswrapper[4845]: I0202 11:53:47.492320 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_a61fa08e-868a-4415-88d5-7ed0eebbeb45/setup-container/0.log" Feb 02 11:53:47 crc kubenswrapper[4845]: I0202 11:53:47.801610 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_a61fa08e-868a-4415-88d5-7ed0eebbeb45/setup-container/0.log" Feb 02 11:53:47 crc kubenswrapper[4845]: I0202 11:53:47.852006 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-2_a61fa08e-868a-4415-88d5-7ed0eebbeb45/rabbitmq/0.log" Feb 02 11:53:48 crc kubenswrapper[4845]: I0202 11:53:48.108165 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-fwkp8_acbaf357-af6c-46b6-b6f0-de2b6e4ee44c/swift-ring-rebalance/0.log" Feb 02 11:53:48 crc kubenswrapper[4845]: I0202 11:53:48.110669 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67878d9fbc-npvwk_1d9f4b80-6273-4d77-9309-2ffecc5acc64/proxy-server/0.log" Feb 02 11:53:48 crc kubenswrapper[4845]: I0202 11:53:48.123003 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-67878d9fbc-npvwk_1d9f4b80-6273-4d77-9309-2ffecc5acc64/proxy-httpd/0.log" Feb 02 11:53:48 crc kubenswrapper[4845]: I0202 11:53:48.403769 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/account-auditor/0.log" Feb 02 11:53:48 crc kubenswrapper[4845]: I0202 11:53:48.452160 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/account-reaper/0.log" Feb 02 11:53:48 crc kubenswrapper[4845]: I0202 11:53:48.480956 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/account-replicator/0.log" Feb 02 11:53:48 crc kubenswrapper[4845]: I0202 11:53:48.910516 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/account-server/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.031771 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/container-auditor/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.069693 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/container-server/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.111136 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/container-replicator/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.271525 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/container-updater/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.305086 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/object-expirer/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.533606 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/object-auditor/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.649545 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/object-replicator/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.706687 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/object-updater/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.731104 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/object-server/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.781960 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/rsync/0.log" Feb 02 11:53:49 crc kubenswrapper[4845]: I0202 11:53:49.983097 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d6db6e42-984a-484b-9f90-e6efa9817f37/swift-recon-cron/0.log" Feb 02 11:53:53 crc kubenswrapper[4845]: I0202 11:53:53.432617 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b34640b6-49ff-4638-bde8-1bc32e658907/memcached/0.log" Feb 02 11:54:16 crc kubenswrapper[4845]: I0202 11:54:16.237642 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:54:16 crc kubenswrapper[4845]: I0202 11:54:16.238283 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:54:20 crc kubenswrapper[4845]: I0202 11:54:20.264355 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf_08c56222-38b1-47b8-b554-cc59e503ecf0/util/0.log" Feb 02 11:54:20 crc kubenswrapper[4845]: I0202 11:54:20.573391 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf_08c56222-38b1-47b8-b554-cc59e503ecf0/util/0.log" Feb 02 11:54:20 crc kubenswrapper[4845]: I0202 11:54:20.604179 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf_08c56222-38b1-47b8-b554-cc59e503ecf0/pull/0.log" Feb 02 11:54:20 crc kubenswrapper[4845]: I0202 11:54:20.611039 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf_08c56222-38b1-47b8-b554-cc59e503ecf0/pull/0.log" Feb 02 11:54:20 crc kubenswrapper[4845]: I0202 11:54:20.809122 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf_08c56222-38b1-47b8-b554-cc59e503ecf0/extract/0.log" Feb 02 11:54:20 crc kubenswrapper[4845]: I0202 11:54:20.815202 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf_08c56222-38b1-47b8-b554-cc59e503ecf0/pull/0.log" Feb 02 11:54:20 crc kubenswrapper[4845]: I0202 11:54:20.825210 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a302fc681e2ed83f5d2c11a7c6c4aafa0a15617c72a70292b42fcaafcfz69tf_08c56222-38b1-47b8-b554-cc59e503ecf0/util/0.log" Feb 02 11:54:21 crc kubenswrapper[4845]: I0202 11:54:21.055798 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-cfvq7_202de28c-c44a-43d9-98fd-4b34b1dcc65f/manager/0.log" Feb 02 11:54:21 crc kubenswrapper[4845]: I0202 11:54:21.071286 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-c4jdf_d9196fe1-4a04-44c1-9a5f-1ad5de52da7f/manager/0.log" Feb 02 11:54:21 crc kubenswrapper[4845]: I0202 11:54:21.274293 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-qfprx_efa2be30-a7d0-4b26-865a-58448de203a0/manager/0.log" Feb 02 11:54:21 crc kubenswrapper[4845]: I0202 11:54:21.345714 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-9c2wv_85439e8a-f7d3-4e0b-827c-bf27e8cd53dd/manager/0.log" Feb 02 11:54:21 crc kubenswrapper[4845]: I0202 
11:54:21.572546 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-pdfcx_745626d8-548b-43bb-aee8-eeab34a86427/manager/0.log" Feb 02 11:54:21 crc kubenswrapper[4845]: I0202 11:54:21.580512 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-msjcj_1b72ed0e-9df5-459f-8ca9-de19874a3018/manager/0.log" Feb 02 11:54:21 crc kubenswrapper[4845]: I0202 11:54:21.971325 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-bc2xw_36101b5e-a4ec-42b8-bb19-1cd2df2897c6/manager/0.log" Feb 02 11:54:21 crc kubenswrapper[4845]: I0202 11:54:21.971687 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-m55cc_f3c02aa0-5039-4a4f-ae11-1bac119f7e31/manager/0.log" Feb 02 11:54:22 crc kubenswrapper[4845]: I0202 11:54:22.152743 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-w55l4_7cc6d028-e9d2-459c-b34c-d069917832a4/manager/0.log" Feb 02 11:54:22 crc kubenswrapper[4845]: I0202 11:54:22.273270 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-plj9z_de6ca8d3-5e0c-4b7f-b7b5-ce0d9fa4856d/manager/0.log" Feb 02 11:54:22 crc kubenswrapper[4845]: I0202 11:54:22.412643 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-t89t9_568bf546-0674-4dbd-91d8-9497c682e368/manager/0.log" Feb 02 11:54:22 crc kubenswrapper[4845]: I0202 11:54:22.480530 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-hnt9f_a1c4a4d1-3974-47c1-9efc-ee88a38e13a5/manager/0.log" Feb 02 11:54:22 crc 
kubenswrapper[4845]: I0202 11:54:22.704938 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-p98cd_cac06f19-af65-481d-b739-68375e8d2968/manager/0.log" Feb 02 11:54:22 crc kubenswrapper[4845]: I0202 11:54:22.790632 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-s97rq_30843195-75a4-4b59-9193-dacda845ace7/manager/0.log" Feb 02 11:54:22 crc kubenswrapper[4845]: I0202 11:54:22.951328 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-59c4b45c4dx44lh_146aa38c-b63c-485a-9c55-006031cfcaa0/manager/0.log" Feb 02 11:54:23 crc kubenswrapper[4845]: I0202 11:54:23.114131 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5649bd689f-k5lt8_e693a9f1-6990-407e-9d01-a23428a6f602/operator/0.log" Feb 02 11:54:23 crc kubenswrapper[4845]: I0202 11:54:23.635244 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-csz6h_153335e1-79de-4c5c-a3cd-2731d0998994/registry-server/0.log" Feb 02 11:54:23 crc kubenswrapper[4845]: I0202 11:54:23.827263 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-lt2wj_eae9c104-9193-4404-b25a-3a47932ef374/manager/0.log" Feb 02 11:54:23 crc kubenswrapper[4845]: I0202 11:54:23.967883 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-9ltr5_70403789-9865-4c4d-a969-118a157e564e/manager/0.log" Feb 02 11:54:24 crc kubenswrapper[4845]: I0202 11:54:24.188821 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-z5f9l_39f98254-3b87-4ac2-be8c-7d7a0f29d6ce/operator/0.log" Feb 02 
11:54:24 crc kubenswrapper[4845]: I0202 11:54:24.281277 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b7c7bb6c9-k6h5z_bd7f3a0c-1bdf-4673-b657-f56e7040f2a1/manager/0.log" Feb 02 11:54:24 crc kubenswrapper[4845]: I0202 11:54:24.416783 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-w7bxj_817413ef-6c47-47ec-8e08-8dffd27c1e11/manager/0.log" Feb 02 11:54:24 crc kubenswrapper[4845]: I0202 11:54:24.642224 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-5f4q4_09ccace8-b972-48ae-a15d-ecf88a300105/manager/0.log" Feb 02 11:54:24 crc kubenswrapper[4845]: I0202 11:54:24.773545 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6bbb97ddc6-fx4tn_1cec8fc8-2b7b-4332-92a4-05483486f925/manager/0.log" Feb 02 11:54:24 crc kubenswrapper[4845]: I0202 11:54:24.908448 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-mkrpp_5d4eb1a9-137a-4959-9d37-d81ee9c6dd54/manager/0.log" Feb 02 11:54:46 crc kubenswrapper[4845]: I0202 11:54:46.237654 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:54:46 crc kubenswrapper[4845]: I0202 11:54:46.238322 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Feb 02 11:54:49 crc kubenswrapper[4845]: I0202 11:54:49.709355 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-c46tw_722bda9f-5a8b-4c83-8b1f-790da0003ce9/control-plane-machine-set-operator/0.log" Feb 02 11:54:50 crc kubenswrapper[4845]: I0202 11:54:50.187297 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xk8gn_6bf70521-8fdf-400f-b7cd-d96b609b4783/kube-rbac-proxy/0.log" Feb 02 11:54:50 crc kubenswrapper[4845]: I0202 11:54:50.228370 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xk8gn_6bf70521-8fdf-400f-b7cd-d96b609b4783/machine-api-operator/0.log" Feb 02 11:55:03 crc kubenswrapper[4845]: I0202 11:55:03.940788 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7p596_c1996e72-3bd0-4770-9662-c0c1359d7a8b/cert-manager-controller/0.log" Feb 02 11:55:04 crc kubenswrapper[4845]: I0202 11:55:04.249120 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-vqsd8_8b99109d-f1ff-4d24-b08a-c317fffd456c/cert-manager-cainjector/0.log" Feb 02 11:55:04 crc kubenswrapper[4845]: I0202 11:55:04.286036 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-ltwq9_7b6c985e-704e-4ff8-b668-d2f4cb218172/cert-manager-webhook/0.log" Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.238044 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.238596 4845 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.238640 4845 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.239608 4845 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"} pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.239664 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" containerID="cri-o://3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" gracePeriod=600 Feb 02 11:55:16 crc kubenswrapper[4845]: E0202 11:55:16.328608 4845 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf2f253_531f_4835_84c1_928680352f7f.slice/crio-conmon-3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:55:16 crc kubenswrapper[4845]: E0202 11:55:16.378692 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.819867 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-2phr4_f49a4fe2-aa60-4d14-a9bb-f13d0066a542/nmstate-console-plugin/0.log" Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.845197 4845 generic.go:334] "Generic (PLEG): container finished" podID="ebf2f253-531f-4835-84c1-928680352f7f" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" exitCode=0 Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.845241 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerDied","Data":"3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"} Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.845285 4845 scope.go:117] "RemoveContainer" containerID="4667d56dc23295df8831dac2a0e953a5f6c9c92d45b3e4effcfbe305f29defc5" Feb 02 11:55:16 crc kubenswrapper[4845]: I0202 11:55:16.846115 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:55:16 crc kubenswrapper[4845]: E0202 11:55:16.846448 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:55:17 crc kubenswrapper[4845]: I0202 11:55:17.109986 4845 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-ks7bq_8c3ff69a-c422-491b-a933-0522f29d7e7c/nmstate-handler/0.log" Feb 02 11:55:17 crc kubenswrapper[4845]: I0202 11:55:17.172595 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-ksh5c_65b8d7a7-4de6-4edc-b652-999572c3494a/kube-rbac-proxy/0.log" Feb 02 11:55:17 crc kubenswrapper[4845]: I0202 11:55:17.275902 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-ksh5c_65b8d7a7-4de6-4edc-b652-999572c3494a/nmstate-metrics/0.log" Feb 02 11:55:17 crc kubenswrapper[4845]: I0202 11:55:17.359860 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-xpndf_17b0c917-994c-41bc-9fbf-6e9d86d65bca/nmstate-operator/0.log" Feb 02 11:55:17 crc kubenswrapper[4845]: I0202 11:55:17.472303 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-k2dv5_ed30c5ac-3449-4902-b948-34958198b224/nmstate-webhook/0.log" Feb 02 11:55:29 crc kubenswrapper[4845]: I0202 11:55:29.721718 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:55:29 crc kubenswrapper[4845]: E0202 11:55:29.722584 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:55:30 crc kubenswrapper[4845]: I0202 11:55:30.418332 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b659b8cd7-mwl8b_2988a5fa-2703-4a60-bcd6-dc81ceea7e1a/manager/0.log" Feb 02 11:55:30 crc kubenswrapper[4845]: I0202 11:55:30.441758 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b659b8cd7-mwl8b_2988a5fa-2703-4a60-bcd6-dc81ceea7e1a/kube-rbac-proxy/0.log" Feb 02 11:55:43 crc kubenswrapper[4845]: I0202 11:55:43.268763 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rmj27_308cfce2-8d47-45e6-9153-a8cd92a8758b/prometheus-operator/0.log" Feb 02 11:55:43 crc kubenswrapper[4845]: I0202 11:55:43.516164 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_d289096b-a35d-4a41-90a3-cab735629cc7/prometheus-operator-admission-webhook/0.log" Feb 02 11:55:43 crc kubenswrapper[4845]: I0202 11:55:43.543713 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413/prometheus-operator-admission-webhook/0.log" Feb 02 11:55:43 crc kubenswrapper[4845]: I0202 11:55:43.900990 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-5wvdz_b75686c5-933f-4f8d-bf87-0229795baf12/operator/0.log" Feb 02 11:55:43 crc kubenswrapper[4845]: I0202 11:55:43.929783 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-8x2hl_0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b/observability-ui-dashboards/0.log" Feb 02 11:55:44 crc kubenswrapper[4845]: I0202 11:55:44.059254 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-8hhqb_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec/perses-operator/0.log" Feb 02 11:55:44 
crc kubenswrapper[4845]: I0202 11:55:44.717425 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:55:44 crc kubenswrapper[4845]: E0202 11:55:44.718204 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:55:57 crc kubenswrapper[4845]: I0202 11:55:57.713364 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:55:57 crc kubenswrapper[4845]: E0202 11:55:57.714048 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:56:00 crc kubenswrapper[4845]: I0202 11:56:00.820584 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-79cf69ddc8-4pzr6_cb944758-f09b-4486-9f3b-4ef87b53246b/cluster-logging-operator/0.log" Feb 02 11:56:00 crc kubenswrapper[4845]: I0202 11:56:00.845372 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-bkwj8_54453df2-b815-42be-9542-aef7eed68aeb/collector/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.025158 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-compactor-0_1f889290-f739-444c-a278-254f68d9d886/loki-compactor/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.106658 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-5f678c8dd6-847z7_4af06166-f541-44e7-8b4b-37e4f39a8729/loki-distributor/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.249685 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cf45dcc8c-vr5gw_2b18d0a9-d2cc-4d0b-9ede-a78da13ac929/opa/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.256756 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cf45dcc8c-vr5gw_2b18d0a9-d2cc-4d0b-9ede-a78da13ac929/gateway/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.446840 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cf45dcc8c-wn9nt_1a4ec7d2-3bae-4f70-9a46-e90b067a0518/opa/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.457003 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-cf45dcc8c-wn9nt_1a4ec7d2-3bae-4f70-9a46-e90b067a0518/gateway/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.543362 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_10b5b71f-47de-4ca2-9133-254552173c73/loki-index-gateway/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.730869 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_2d889e99-8118-4f52-ab20-b69a55bec079/loki-ingester/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.743295 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-querier-76788598db-sbp94_796c275c-0c9b-4b2e-ba0f-7fbeb645028a/loki-querier/0.log" Feb 02 11:56:01 crc kubenswrapper[4845]: I0202 11:56:01.920457 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-69d9546745-dz4l8_27a684fe-6402-4a0d-ab7c-e5c4eab14a64/loki-query-frontend/0.log" Feb 02 11:56:12 crc kubenswrapper[4845]: I0202 11:56:12.712822 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:56:12 crc kubenswrapper[4845]: E0202 11:56:12.713907 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.107773 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-pwcrt_64760ce4-85d6-4e58-aa77-99c1ca4d936e/kube-rbac-proxy/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.293342 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-pwcrt_64760ce4-85d6-4e58-aa77-99c1ca4d936e/controller/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.403411 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-frr-files/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.575671 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-frr-files/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 
11:56:17.597147 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-reloader/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.652436 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-metrics/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.677374 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-reloader/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.814311 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-frr-files/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.840268 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-reloader/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.862449 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-metrics/0.log" Feb 02 11:56:17 crc kubenswrapper[4845]: I0202 11:56:17.888927 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-metrics/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.085687 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-metrics/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.085755 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-reloader/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.104067 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/cp-frr-files/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.124632 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/controller/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.257999 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/frr-metrics/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.288640 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/kube-rbac-proxy/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.347501 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/kube-rbac-proxy-frr/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.580844 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/reloader/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.602811 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-hd78b_8e70dfea-db96-43f0-82ea-e9342326f82f/frr-k8s-webhook-server/0.log" Feb 02 11:56:18 crc kubenswrapper[4845]: I0202 11:56:18.895019 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-bcff8566-gkqml_71926ac8-4fc3-41de-8295-01c8ddbb9d27/manager/0.log" Feb 02 11:56:19 crc kubenswrapper[4845]: I0202 11:56:19.112456 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66c6bb874c-q55bn_d2f82fb6-ff9c-4578-8e8c-2bc454b09927/webhook-server/0.log" Feb 02 11:56:19 crc kubenswrapper[4845]: I0202 11:56:19.129870 4845 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7gchc_ae8d6393-e53b-4acc-9a90-094d95e29c03/kube-rbac-proxy/0.log" Feb 02 11:56:19 crc kubenswrapper[4845]: I0202 11:56:19.693535 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bnlrj_572a79b7-a042-4090-afdd-924cdb0f9d3e/frr/0.log" Feb 02 11:56:19 crc kubenswrapper[4845]: I0202 11:56:19.944222 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7gchc_ae8d6393-e53b-4acc-9a90-094d95e29c03/speaker/0.log" Feb 02 11:56:25 crc kubenswrapper[4845]: I0202 11:56:25.713758 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:56:25 crc kubenswrapper[4845]: E0202 11:56:25.715801 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:56:34 crc kubenswrapper[4845]: I0202 11:56:34.105852 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz_2e68fd72-b961-4a58-9f54-01bd2f6ebd76/util/0.log" Feb 02 11:56:34 crc kubenswrapper[4845]: I0202 11:56:34.309768 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz_2e68fd72-b961-4a58-9f54-01bd2f6ebd76/util/0.log" Feb 02 11:56:34 crc kubenswrapper[4845]: I0202 11:56:34.318927 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz_2e68fd72-b961-4a58-9f54-01bd2f6ebd76/pull/0.log" Feb 02 11:56:34 crc kubenswrapper[4845]: I0202 11:56:34.356666 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz_2e68fd72-b961-4a58-9f54-01bd2f6ebd76/pull/0.log" Feb 02 11:56:34 crc kubenswrapper[4845]: I0202 11:56:34.533005 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz_2e68fd72-b961-4a58-9f54-01bd2f6ebd76/util/0.log" Feb 02 11:56:34 crc kubenswrapper[4845]: I0202 11:56:34.552149 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz_2e68fd72-b961-4a58-9f54-01bd2f6ebd76/extract/0.log" Feb 02 11:56:34 crc kubenswrapper[4845]: I0202 11:56:34.575499 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_19f7b28a9b43ae652fc2e0b84ee4ec326dbd0a997d417d0c402b7249a2v66lz_2e68fd72-b961-4a58-9f54-01bd2f6ebd76/pull/0.log" Feb 02 11:56:34 crc kubenswrapper[4845]: I0202 11:56:34.760020 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr_2d07b157-761c-4649-ace7-6b9e73636713/util/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.000533 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr_2d07b157-761c-4649-ace7-6b9e73636713/pull/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.040533 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr_2d07b157-761c-4649-ace7-6b9e73636713/util/0.log" Feb 02 
11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.064369 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr_2d07b157-761c-4649-ace7-6b9e73636713/pull/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.217423 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr_2d07b157-761c-4649-ace7-6b9e73636713/extract/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.217541 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr_2d07b157-761c-4649-ace7-6b9e73636713/pull/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.252879 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc58ffr_2d07b157-761c-4649-ace7-6b9e73636713/util/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.401714 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt_8ead5170-aa2d-4a22-a528-02edf1375239/util/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.595975 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt_8ead5170-aa2d-4a22-a528-02edf1375239/util/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.675067 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt_8ead5170-aa2d-4a22-a528-02edf1375239/pull/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.677345 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt_8ead5170-aa2d-4a22-a528-02edf1375239/pull/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.895262 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt_8ead5170-aa2d-4a22-a528-02edf1375239/extract/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.908732 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt_8ead5170-aa2d-4a22-a528-02edf1375239/pull/0.log" Feb 02 11:56:35 crc kubenswrapper[4845]: I0202 11:56:35.922571 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_40d905839fa7263f1f473fab6e11a9af2a700db4f753f3af512410360bc9bxt_8ead5170-aa2d-4a22-a528-02edf1375239/util/0.log" Feb 02 11:56:36 crc kubenswrapper[4845]: I0202 11:56:36.092540 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq_cbe2dd1b-0b96-4fb7-8873-f9c1378bde92/util/0.log" Feb 02 11:56:36 crc kubenswrapper[4845]: I0202 11:56:36.226577 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq_cbe2dd1b-0b96-4fb7-8873-f9c1378bde92/util/0.log" Feb 02 11:56:36 crc kubenswrapper[4845]: I0202 11:56:36.254878 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq_cbe2dd1b-0b96-4fb7-8873-f9c1378bde92/pull/0.log" Feb 02 11:56:36 crc kubenswrapper[4845]: I0202 11:56:36.291637 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq_cbe2dd1b-0b96-4fb7-8873-f9c1378bde92/pull/0.log" Feb 02 
11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.027472 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq_cbe2dd1b-0b96-4fb7-8873-f9c1378bde92/util/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.028738 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq_cbe2dd1b-0b96-4fb7-8873-f9c1378bde92/pull/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.112285 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713md5sq_cbe2dd1b-0b96-4fb7-8873-f9c1378bde92/extract/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.244036 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw_fe3cf6fe-df9c-4484-a6af-75fe0b5fa907/util/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.444665 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw_fe3cf6fe-df9c-4484-a6af-75fe0b5fa907/pull/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.456510 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw_fe3cf6fe-df9c-4484-a6af-75fe0b5fa907/pull/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.466164 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw_fe3cf6fe-df9c-4484-a6af-75fe0b5fa907/util/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.723541 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw_fe3cf6fe-df9c-4484-a6af-75fe0b5fa907/util/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.746717 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw_fe3cf6fe-df9c-4484-a6af-75fe0b5fa907/pull/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.768172 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08p55sw_fe3cf6fe-df9c-4484-a6af-75fe0b5fa907/extract/0.log" Feb 02 11:56:37 crc kubenswrapper[4845]: I0202 11:56:37.925729 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-skmdg_7b143223-c383-4b6f-b221-c8908e9f93d9/extract-utilities/0.log" Feb 02 11:56:38 crc kubenswrapper[4845]: I0202 11:56:38.097638 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-skmdg_7b143223-c383-4b6f-b221-c8908e9f93d9/extract-content/0.log" Feb 02 11:56:38 crc kubenswrapper[4845]: I0202 11:56:38.097736 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-skmdg_7b143223-c383-4b6f-b221-c8908e9f93d9/extract-utilities/0.log" Feb 02 11:56:38 crc kubenswrapper[4845]: I0202 11:56:38.105560 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-skmdg_7b143223-c383-4b6f-b221-c8908e9f93d9/extract-content/0.log" Feb 02 11:56:38 crc kubenswrapper[4845]: I0202 11:56:38.329655 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-skmdg_7b143223-c383-4b6f-b221-c8908e9f93d9/extract-utilities/0.log" Feb 02 11:56:38 crc kubenswrapper[4845]: I0202 11:56:38.330283 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-skmdg_7b143223-c383-4b6f-b221-c8908e9f93d9/extract-content/0.log" Feb 02 11:56:38 crc kubenswrapper[4845]: I0202 11:56:38.479677 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k66k5_26334878-6884-4481-b360-96927a5dd3d6/extract-utilities/0.log" Feb 02 11:56:39 crc kubenswrapper[4845]: I0202 11:56:39.211763 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-skmdg_7b143223-c383-4b6f-b221-c8908e9f93d9/registry-server/0.log" Feb 02 11:56:39 crc kubenswrapper[4845]: I0202 11:56:39.234346 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k66k5_26334878-6884-4481-b360-96927a5dd3d6/extract-utilities/0.log" Feb 02 11:56:39 crc kubenswrapper[4845]: I0202 11:56:39.246998 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k66k5_26334878-6884-4481-b360-96927a5dd3d6/extract-content/0.log" Feb 02 11:56:39 crc kubenswrapper[4845]: I0202 11:56:39.282355 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k66k5_26334878-6884-4481-b360-96927a5dd3d6/extract-content/0.log" Feb 02 11:56:39 crc kubenswrapper[4845]: I0202 11:56:39.543881 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k66k5_26334878-6884-4481-b360-96927a5dd3d6/extract-content/0.log" Feb 02 11:56:39 crc kubenswrapper[4845]: I0202 11:56:39.568857 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k66k5_26334878-6884-4481-b360-96927a5dd3d6/extract-utilities/0.log" Feb 02 11:56:39 crc kubenswrapper[4845]: I0202 11:56:39.699681 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ms22s_9fc452cb-0731-44f6-aae8-bad730786d8a/marketplace-operator/0.log" Feb 02 11:56:39 crc kubenswrapper[4845]: I0202 11:56:39.852532 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vfjf_ac22736d-1901-40bf-a17f-186de03c64bf/extract-utilities/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.052640 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vfjf_ac22736d-1901-40bf-a17f-186de03c64bf/extract-utilities/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.087806 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vfjf_ac22736d-1901-40bf-a17f-186de03c64bf/extract-content/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.118771 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vfjf_ac22736d-1901-40bf-a17f-186de03c64bf/extract-content/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.364762 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vfjf_ac22736d-1901-40bf-a17f-186de03c64bf/extract-utilities/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.390329 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k66k5_26334878-6884-4481-b360-96927a5dd3d6/registry-server/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.442459 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vfjf_ac22736d-1901-40bf-a17f-186de03c64bf/extract-content/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.596035 4845 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-5vfjf_ac22736d-1901-40bf-a17f-186de03c64bf/registry-server/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.622516 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-575p8_5c039981-931c-408f-8185-4d22b3da04a3/extract-utilities/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.713460 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:56:40 crc kubenswrapper[4845]: E0202 11:56:40.713779 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.812216 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-575p8_5c039981-931c-408f-8185-4d22b3da04a3/extract-content/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.822109 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-575p8_5c039981-931c-408f-8185-4d22b3da04a3/extract-utilities/0.log" Feb 02 11:56:40 crc kubenswrapper[4845]: I0202 11:56:40.834876 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-575p8_5c039981-931c-408f-8185-4d22b3da04a3/extract-content/0.log" Feb 02 11:56:41 crc kubenswrapper[4845]: I0202 11:56:41.017586 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-575p8_5c039981-931c-408f-8185-4d22b3da04a3/extract-utilities/0.log" Feb 02 11:56:41 crc 
kubenswrapper[4845]: I0202 11:56:41.035405 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-575p8_5c039981-931c-408f-8185-4d22b3da04a3/extract-content/0.log" Feb 02 11:56:41 crc kubenswrapper[4845]: I0202 11:56:41.707869 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-575p8_5c039981-931c-408f-8185-4d22b3da04a3/registry-server/0.log" Feb 02 11:56:53 crc kubenswrapper[4845]: I0202 11:56:53.713228 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:56:53 crc kubenswrapper[4845]: E0202 11:56:53.714179 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:56:54 crc kubenswrapper[4845]: I0202 11:56:54.155624 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-858f6bffb9-kx466_631a8bcf-3ed5-4bd7-8fcd-2f4a56f47413/prometheus-operator-admission-webhook/0.log" Feb 02 11:56:54 crc kubenswrapper[4845]: I0202 11:56:54.171354 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-rmj27_308cfce2-8d47-45e6-9153-a8cd92a8758b/prometheus-operator/0.log" Feb 02 11:56:54 crc kubenswrapper[4845]: I0202 11:56:54.199025 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-858f6bffb9-fpgch_d289096b-a35d-4a41-90a3-cab735629cc7/prometheus-operator-admission-webhook/0.log" Feb 02 11:56:54 crc kubenswrapper[4845]: I0202 
11:56:54.349124 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-8x2hl_0bd36a22-def8-4ad5-b1b2-ac23ef1ea70b/observability-ui-dashboards/0.log" Feb 02 11:56:54 crc kubenswrapper[4845]: I0202 11:56:54.377700 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-5wvdz_b75686c5-933f-4f8d-bf87-0229795baf12/operator/0.log" Feb 02 11:56:54 crc kubenswrapper[4845]: I0202 11:56:54.394713 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-8hhqb_1b09e2d3-a6a3-49bd-91c0-f75b4957f2ec/perses-operator/0.log" Feb 02 11:57:04 crc kubenswrapper[4845]: I0202 11:57:04.712794 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:57:04 crc kubenswrapper[4845]: E0202 11:57:04.713541 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:57:08 crc kubenswrapper[4845]: I0202 11:57:08.933269 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b659b8cd7-mwl8b_2988a5fa-2703-4a60-bcd6-dc81ceea7e1a/manager/0.log" Feb 02 11:57:09 crc kubenswrapper[4845]: I0202 11:57:09.006202 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-b659b8cd7-mwl8b_2988a5fa-2703-4a60-bcd6-dc81ceea7e1a/kube-rbac-proxy/0.log" Feb 02 11:57:18 crc kubenswrapper[4845]: I0202 11:57:18.712699 4845 scope.go:117] "RemoveContainer" 
containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:57:18 crc kubenswrapper[4845]: E0202 11:57:18.713504 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:57:30 crc kubenswrapper[4845]: I0202 11:57:30.713958 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:57:30 crc kubenswrapper[4845]: E0202 11:57:30.714645 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:57:45 crc kubenswrapper[4845]: I0202 11:57:45.714655 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:57:45 crc kubenswrapper[4845]: E0202 11:57:45.715374 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:57:56 crc kubenswrapper[4845]: I0202 11:57:56.713849 4845 scope.go:117] 
"RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:57:56 crc kubenswrapper[4845]: E0202 11:57:56.714677 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:58:09 crc kubenswrapper[4845]: I0202 11:58:09.720867 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:58:09 crc kubenswrapper[4845]: E0202 11:58:09.722192 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:58:21 crc kubenswrapper[4845]: I0202 11:58:21.716528 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 11:58:21 crc kubenswrapper[4845]: E0202 11:58:21.718416 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 11:58:33 crc kubenswrapper[4845]: I0202 11:58:33.712912 
4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"
Feb 02 11:58:33 crc kubenswrapper[4845]: E0202 11:58:33.713750 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:58:45 crc kubenswrapper[4845]: I0202 11:58:45.712806 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"
Feb 02 11:58:45 crc kubenswrapper[4845]: E0202 11:58:45.713735 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:58:56 crc kubenswrapper[4845]: I0202 11:58:56.713025 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"
Feb 02 11:58:56 crc kubenswrapper[4845]: E0202 11:58:56.713766 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:58:59 crc kubenswrapper[4845]: I0202 11:58:59.208282 4845 generic.go:334] "Generic (PLEG): container finished" podID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerID="70dfdbb3165ccda62cfeb30152b80bca1a6f55a7313abd9ca43a514f2c7bab8e" exitCode=0
Feb 02 11:58:59 crc kubenswrapper[4845]: I0202 11:58:59.208415 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vx6jc/must-gather-x6jwm" event={"ID":"a0d660a9-36eb-4eee-a756-27847f623aa6","Type":"ContainerDied","Data":"70dfdbb3165ccda62cfeb30152b80bca1a6f55a7313abd9ca43a514f2c7bab8e"}
Feb 02 11:58:59 crc kubenswrapper[4845]: I0202 11:58:59.209364 4845 scope.go:117] "RemoveContainer" containerID="70dfdbb3165ccda62cfeb30152b80bca1a6f55a7313abd9ca43a514f2c7bab8e"
Feb 02 11:59:00 crc kubenswrapper[4845]: I0202 11:59:00.011033 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vx6jc_must-gather-x6jwm_a0d660a9-36eb-4eee-a756-27847f623aa6/gather/0.log"
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.120826 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vx6jc/must-gather-x6jwm"]
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.121800 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vx6jc/must-gather-x6jwm" podUID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerName="copy" containerID="cri-o://7aac41ff8df6e02efefc2ff06fc1a8b635ec6f7893e527494a9992604219ab7f" gracePeriod=2
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.132070 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vx6jc/must-gather-x6jwm"]
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.310309 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vx6jc_must-gather-x6jwm_a0d660a9-36eb-4eee-a756-27847f623aa6/copy/0.log"
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.310981 4845 generic.go:334] "Generic (PLEG): container finished" podID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerID="7aac41ff8df6e02efefc2ff06fc1a8b635ec6f7893e527494a9992604219ab7f" exitCode=143
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.720111 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vx6jc_must-gather-x6jwm_a0d660a9-36eb-4eee-a756-27847f623aa6/copy/0.log"
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.724797 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/must-gather-x6jwm"
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.800711 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqd9j\" (UniqueName: \"kubernetes.io/projected/a0d660a9-36eb-4eee-a756-27847f623aa6-kube-api-access-rqd9j\") pod \"a0d660a9-36eb-4eee-a756-27847f623aa6\" (UID: \"a0d660a9-36eb-4eee-a756-27847f623aa6\") "
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.800859 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0d660a9-36eb-4eee-a756-27847f623aa6-must-gather-output\") pod \"a0d660a9-36eb-4eee-a756-27847f623aa6\" (UID: \"a0d660a9-36eb-4eee-a756-27847f623aa6\") "
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.817486 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d660a9-36eb-4eee-a756-27847f623aa6-kube-api-access-rqd9j" (OuterVolumeSpecName: "kube-api-access-rqd9j") pod "a0d660a9-36eb-4eee-a756-27847f623aa6" (UID: "a0d660a9-36eb-4eee-a756-27847f623aa6"). InnerVolumeSpecName "kube-api-access-rqd9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:59:08 crc kubenswrapper[4845]: I0202 11:59:08.905750 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqd9j\" (UniqueName: \"kubernetes.io/projected/a0d660a9-36eb-4eee-a756-27847f623aa6-kube-api-access-rqd9j\") on node \"crc\" DevicePath \"\""
Feb 02 11:59:09 crc kubenswrapper[4845]: I0202 11:59:09.011620 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d660a9-36eb-4eee-a756-27847f623aa6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a0d660a9-36eb-4eee-a756-27847f623aa6" (UID: "a0d660a9-36eb-4eee-a756-27847f623aa6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:59:09 crc kubenswrapper[4845]: I0202 11:59:09.111400 4845 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a0d660a9-36eb-4eee-a756-27847f623aa6-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 02 11:59:09 crc kubenswrapper[4845]: I0202 11:59:09.335008 4845 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vx6jc_must-gather-x6jwm_a0d660a9-36eb-4eee-a756-27847f623aa6/copy/0.log"
Feb 02 11:59:09 crc kubenswrapper[4845]: I0202 11:59:09.338968 4845 scope.go:117] "RemoveContainer" containerID="7aac41ff8df6e02efefc2ff06fc1a8b635ec6f7893e527494a9992604219ab7f"
Feb 02 11:59:09 crc kubenswrapper[4845]: I0202 11:59:09.339190 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vx6jc/must-gather-x6jwm"
Feb 02 11:59:09 crc kubenswrapper[4845]: I0202 11:59:09.365346 4845 scope.go:117] "RemoveContainer" containerID="70dfdbb3165ccda62cfeb30152b80bca1a6f55a7313abd9ca43a514f2c7bab8e"
Feb 02 11:59:09 crc kubenswrapper[4845]: I0202 11:59:09.731491 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d660a9-36eb-4eee-a756-27847f623aa6" path="/var/lib/kubelet/pods/a0d660a9-36eb-4eee-a756-27847f623aa6/volumes"
Feb 02 11:59:09 crc kubenswrapper[4845]: I0202 11:59:09.742673 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"
Feb 02 11:59:09 crc kubenswrapper[4845]: E0202 11:59:09.744239 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:59:21 crc kubenswrapper[4845]: I0202 11:59:21.735676 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"
Feb 02 11:59:21 crc kubenswrapper[4845]: E0202 11:59:21.737409 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:59:32 crc kubenswrapper[4845]: I0202 11:59:32.713586 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"
Feb 02 11:59:32 crc kubenswrapper[4845]: E0202 11:59:32.714645 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.655031 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7sp74"]
Feb 02 11:59:36 crc kubenswrapper[4845]: E0202 11:59:36.656144 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerName="extract-content"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656165 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerName="extract-content"
Feb 02 11:59:36 crc kubenswrapper[4845]: E0202 11:59:36.656175 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359cd7da-7e78-430b-90df-0924f27608c2" containerName="extract-utilities"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656183 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="359cd7da-7e78-430b-90df-0924f27608c2" containerName="extract-utilities"
Feb 02 11:59:36 crc kubenswrapper[4845]: E0202 11:59:36.656196 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerName="extract-utilities"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656203 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerName="extract-utilities"
Feb 02 11:59:36 crc kubenswrapper[4845]: E0202 11:59:36.656230 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerName="copy"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656237 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerName="copy"
Feb 02 11:59:36 crc kubenswrapper[4845]: E0202 11:59:36.656257 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359cd7da-7e78-430b-90df-0924f27608c2" containerName="extract-content"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656264 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="359cd7da-7e78-430b-90df-0924f27608c2" containerName="extract-content"
Feb 02 11:59:36 crc kubenswrapper[4845]: E0202 11:59:36.656275 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerName="gather"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656282 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerName="gather"
Feb 02 11:59:36 crc kubenswrapper[4845]: E0202 11:59:36.656294 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359cd7da-7e78-430b-90df-0924f27608c2" containerName="registry-server"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656301 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="359cd7da-7e78-430b-90df-0924f27608c2" containerName="registry-server"
Feb 02 11:59:36 crc kubenswrapper[4845]: E0202 11:59:36.656337 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerName="registry-server"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656345 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerName="registry-server"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656630 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d906e33-6090-4cdd-ab5a-749672c65f48" containerName="registry-server"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656648 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerName="gather"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656666 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d660a9-36eb-4eee-a756-27847f623aa6" containerName="copy"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.656681 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="359cd7da-7e78-430b-90df-0924f27608c2" containerName="registry-server"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.659170 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.683784 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sp74"]
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.700191 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr52z\" (UniqueName: \"kubernetes.io/projected/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-kube-api-access-fr52z\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.700477 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-utilities\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.700617 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-catalog-content\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.802841 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr52z\" (UniqueName: \"kubernetes.io/projected/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-kube-api-access-fr52z\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.802967 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-utilities\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.803013 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-catalog-content\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.803647 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-catalog-content\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.803931 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-utilities\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.832077 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr52z\" (UniqueName: \"kubernetes.io/projected/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-kube-api-access-fr52z\") pod \"redhat-marketplace-7sp74\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") " pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:36 crc kubenswrapper[4845]: I0202 11:59:36.988155 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:38 crc kubenswrapper[4845]: I0202 11:59:38.104788 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sp74"]
Feb 02 11:59:38 crc kubenswrapper[4845]: I0202 11:59:38.657304 4845 generic.go:334] "Generic (PLEG): container finished" podID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerID="0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2" exitCode=0
Feb 02 11:59:38 crc kubenswrapper[4845]: I0202 11:59:38.657348 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sp74" event={"ID":"c2a84c41-3b2c-4c19-835c-e4c499d17fd6","Type":"ContainerDied","Data":"0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2"}
Feb 02 11:59:38 crc kubenswrapper[4845]: I0202 11:59:38.657639 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sp74" event={"ID":"c2a84c41-3b2c-4c19-835c-e4c499d17fd6","Type":"ContainerStarted","Data":"ae0be742560de8e985f7d9c6a63dbcc6970fbeb39be0d5d9d0f0e94563fc6946"}
Feb 02 11:59:38 crc kubenswrapper[4845]: I0202 11:59:38.659266 4845 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 11:59:40 crc kubenswrapper[4845]: I0202 11:59:40.685441 4845 generic.go:334] "Generic (PLEG): container finished" podID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerID="3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581" exitCode=0
Feb 02 11:59:40 crc kubenswrapper[4845]: I0202 11:59:40.685579 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sp74" event={"ID":"c2a84c41-3b2c-4c19-835c-e4c499d17fd6","Type":"ContainerDied","Data":"3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581"}
Feb 02 11:59:41 crc kubenswrapper[4845]: I0202 11:59:41.707417 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sp74" event={"ID":"c2a84c41-3b2c-4c19-835c-e4c499d17fd6","Type":"ContainerStarted","Data":"aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b"}
Feb 02 11:59:41 crc kubenswrapper[4845]: I0202 11:59:41.738988 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7sp74" podStartSLOduration=3.305581476 podStartE2EDuration="5.738967055s" podCreationTimestamp="2026-02-02 11:59:36 +0000 UTC" firstStartedPulling="2026-02-02 11:59:38.659006617 +0000 UTC m=+5259.750408087" lastFinishedPulling="2026-02-02 11:59:41.092392206 +0000 UTC m=+5262.183793666" observedRunningTime="2026-02-02 11:59:41.728201167 +0000 UTC m=+5262.819602617" watchObservedRunningTime="2026-02-02 11:59:41.738967055 +0000 UTC m=+5262.830368505"
Feb 02 11:59:46 crc kubenswrapper[4845]: I0202 11:59:46.988612 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:46 crc kubenswrapper[4845]: I0202 11:59:46.996764 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:47 crc kubenswrapper[4845]: I0202 11:59:47.056455 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:47 crc kubenswrapper[4845]: I0202 11:59:47.713153 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"
Feb 02 11:59:47 crc kubenswrapper[4845]: E0202 11:59:47.713692 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 11:59:47 crc kubenswrapper[4845]: I0202 11:59:47.857835 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:47 crc kubenswrapper[4845]: I0202 11:59:47.911785 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sp74"]
Feb 02 11:59:49 crc kubenswrapper[4845]: I0202 11:59:49.814214 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7sp74" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerName="registry-server" containerID="cri-o://aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b" gracePeriod=2
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.272749 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.388410 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-catalog-content\") pod \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") "
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.388930 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-utilities\") pod \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") "
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.388998 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr52z\" (UniqueName: \"kubernetes.io/projected/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-kube-api-access-fr52z\") pod \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\" (UID: \"c2a84c41-3b2c-4c19-835c-e4c499d17fd6\") "
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.390313 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-utilities" (OuterVolumeSpecName: "utilities") pod "c2a84c41-3b2c-4c19-835c-e4c499d17fd6" (UID: "c2a84c41-3b2c-4c19-835c-e4c499d17fd6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.398782 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-kube-api-access-fr52z" (OuterVolumeSpecName: "kube-api-access-fr52z") pod "c2a84c41-3b2c-4c19-835c-e4c499d17fd6" (UID: "c2a84c41-3b2c-4c19-835c-e4c499d17fd6"). InnerVolumeSpecName "kube-api-access-fr52z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.411253 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2a84c41-3b2c-4c19-835c-e4c499d17fd6" (UID: "c2a84c41-3b2c-4c19-835c-e4c499d17fd6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.492300 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.492340 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.492358 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr52z\" (UniqueName: \"kubernetes.io/projected/c2a84c41-3b2c-4c19-835c-e4c499d17fd6-kube-api-access-fr52z\") on node \"crc\" DevicePath \"\""
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.828217 4845 generic.go:334] "Generic (PLEG): container finished" podID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerID="aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b" exitCode=0
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.828304 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sp74" event={"ID":"c2a84c41-3b2c-4c19-835c-e4c499d17fd6","Type":"ContainerDied","Data":"aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b"}
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.828364 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7sp74" event={"ID":"c2a84c41-3b2c-4c19-835c-e4c499d17fd6","Type":"ContainerDied","Data":"ae0be742560de8e985f7d9c6a63dbcc6970fbeb39be0d5d9d0f0e94563fc6946"}
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.828394 4845 scope.go:117] "RemoveContainer" containerID="aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b"
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.828343 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7sp74"
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.850415 4845 scope.go:117] "RemoveContainer" containerID="3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581"
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.883467 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sp74"]
Feb 02 11:59:50 crc kubenswrapper[4845]: I0202 11:59:50.896713 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7sp74"]
Feb 02 11:59:51 crc kubenswrapper[4845]: I0202 11:59:51.057055 4845 scope.go:117] "RemoveContainer" containerID="0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2"
Feb 02 11:59:51 crc kubenswrapper[4845]: I0202 11:59:51.212699 4845 scope.go:117] "RemoveContainer" containerID="aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b"
Feb 02 11:59:51 crc kubenswrapper[4845]: E0202 11:59:51.213286 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b\": container with ID starting with aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b not found: ID does not exist" containerID="aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b"
Feb 02 11:59:51 crc kubenswrapper[4845]: I0202 11:59:51.213327 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b"} err="failed to get container status \"aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b\": rpc error: code = NotFound desc = could not find container \"aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b\": container with ID starting with aca0b5c0675ee6dc221c9a457393cdcd574af5330434a45f36dcedf8c37df25b not found: ID does not exist"
Feb 02 11:59:51 crc kubenswrapper[4845]: I0202 11:59:51.213358 4845 scope.go:117] "RemoveContainer" containerID="3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581"
Feb 02 11:59:51 crc kubenswrapper[4845]: E0202 11:59:51.213649 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581\": container with ID starting with 3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581 not found: ID does not exist" containerID="3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581"
Feb 02 11:59:51 crc kubenswrapper[4845]: I0202 11:59:51.213694 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581"} err="failed to get container status \"3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581\": rpc error: code = NotFound desc = could not find container \"3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581\": container with ID starting with 3a8680cd74b69b425a58f79f2b7b7058593435c91a1fa6335d1f8e160ed99581 not found: ID does not exist"
Feb 02 11:59:51 crc kubenswrapper[4845]: I0202 11:59:51.213719 4845 scope.go:117] "RemoveContainer" containerID="0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2"
Feb 02 11:59:51 crc kubenswrapper[4845]: E0202 11:59:51.214036 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2\": container with ID starting with 0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2 not found: ID does not exist" containerID="0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2"
Feb 02 11:59:51 crc kubenswrapper[4845]: I0202 11:59:51.214061 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2"} err="failed to get container status \"0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2\": rpc error: code = NotFound desc = could not find container \"0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2\": container with ID starting with 0b279424a3b9c4c386a448412312fe3bf0c33b7a092a4d78897a79bfeb62b2c2 not found: ID does not exist"
Feb 02 11:59:51 crc kubenswrapper[4845]: I0202 11:59:51.731353 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" path="/var/lib/kubelet/pods/c2a84c41-3b2c-4c19-835c-e4c499d17fd6/volumes"
Feb 02 11:59:59 crc kubenswrapper[4845]: I0202 11:59:59.723212 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75"
Feb 02 11:59:59 crc kubenswrapper[4845]: E0202 11:59:59.724586 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.163185 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"]
Feb 02 12:00:00 crc kubenswrapper[4845]: E0202 12:00:00.163978 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerName="extract-content"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.164008 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerName="extract-content"
Feb 02 12:00:00 crc kubenswrapper[4845]: E0202 12:00:00.164038 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerName="registry-server"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.164049 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerName="registry-server"
Feb 02 12:00:00 crc kubenswrapper[4845]: E0202 12:00:00.164094 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerName="extract-utilities"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.164107 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerName="extract-utilities"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.164428 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a84c41-3b2c-4c19-835c-e4c499d17fd6" containerName="registry-server"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.165634 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.169954 4845 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.170255 4845 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.179173 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"]
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.244337 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c522af1-b700-4069-82b9-0c84cb693b9e-config-volume\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.244727 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wktkr\" (UniqueName: \"kubernetes.io/projected/6c522af1-b700-4069-82b9-0c84cb693b9e-kube-api-access-wktkr\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.245106 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c522af1-b700-4069-82b9-0c84cb693b9e-secret-volume\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.346965 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c522af1-b700-4069-82b9-0c84cb693b9e-secret-volume\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.347091 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c522af1-b700-4069-82b9-0c84cb693b9e-config-volume\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.347115 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wktkr\" (UniqueName: \"kubernetes.io/projected/6c522af1-b700-4069-82b9-0c84cb693b9e-kube-api-access-wktkr\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.349271 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c522af1-b700-4069-82b9-0c84cb693b9e-config-volume\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.353732 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c522af1-b700-4069-82b9-0c84cb693b9e-secret-volume\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.365520 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wktkr\" (UniqueName: \"kubernetes.io/projected/6c522af1-b700-4069-82b9-0c84cb693b9e-kube-api-access-wktkr\") pod \"collect-profiles-29500560-fhdkr\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.504629 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
Feb 02 12:00:00 crc kubenswrapper[4845]: I0202 12:00:00.996578 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"]
Feb 02 12:00:01 crc kubenswrapper[4845]: I0202 12:00:01.950604 4845 generic.go:334] "Generic (PLEG): container finished" podID="6c522af1-b700-4069-82b9-0c84cb693b9e" containerID="a9a1bdd86fe1a1a40b5e9dda368e1bfc9c7b0c92c27d045a527e9d73e2c3b7f9" exitCode=0
Feb 02 12:00:01 crc kubenswrapper[4845]: I0202 12:00:01.950679 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr" event={"ID":"6c522af1-b700-4069-82b9-0c84cb693b9e","Type":"ContainerDied","Data":"a9a1bdd86fe1a1a40b5e9dda368e1bfc9c7b0c92c27d045a527e9d73e2c3b7f9"}
Feb 02 12:00:01 crc kubenswrapper[4845]: I0202 12:00:01.950944 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr"
event={"ID":"6c522af1-b700-4069-82b9-0c84cb693b9e","Type":"ContainerStarted","Data":"86c6f078dded756def1df3f35c0baebc412526d92b3ff644a7b9e797bd8e4719"} Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.359183 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr" Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.439772 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c522af1-b700-4069-82b9-0c84cb693b9e-config-volume\") pod \"6c522af1-b700-4069-82b9-0c84cb693b9e\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.439999 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wktkr\" (UniqueName: \"kubernetes.io/projected/6c522af1-b700-4069-82b9-0c84cb693b9e-kube-api-access-wktkr\") pod \"6c522af1-b700-4069-82b9-0c84cb693b9e\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.440126 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c522af1-b700-4069-82b9-0c84cb693b9e-secret-volume\") pod \"6c522af1-b700-4069-82b9-0c84cb693b9e\" (UID: \"6c522af1-b700-4069-82b9-0c84cb693b9e\") " Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.440805 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c522af1-b700-4069-82b9-0c84cb693b9e-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c522af1-b700-4069-82b9-0c84cb693b9e" (UID: "6c522af1-b700-4069-82b9-0c84cb693b9e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.441856 4845 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c522af1-b700-4069-82b9-0c84cb693b9e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.448182 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c522af1-b700-4069-82b9-0c84cb693b9e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c522af1-b700-4069-82b9-0c84cb693b9e" (UID: "6c522af1-b700-4069-82b9-0c84cb693b9e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.451463 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c522af1-b700-4069-82b9-0c84cb693b9e-kube-api-access-wktkr" (OuterVolumeSpecName: "kube-api-access-wktkr") pod "6c522af1-b700-4069-82b9-0c84cb693b9e" (UID: "6c522af1-b700-4069-82b9-0c84cb693b9e"). InnerVolumeSpecName "kube-api-access-wktkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.544253 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wktkr\" (UniqueName: \"kubernetes.io/projected/6c522af1-b700-4069-82b9-0c84cb693b9e-kube-api-access-wktkr\") on node \"crc\" DevicePath \"\"" Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.544434 4845 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c522af1-b700-4069-82b9-0c84cb693b9e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.973355 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr" event={"ID":"6c522af1-b700-4069-82b9-0c84cb693b9e","Type":"ContainerDied","Data":"86c6f078dded756def1df3f35c0baebc412526d92b3ff644a7b9e797bd8e4719"} Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.973397 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86c6f078dded756def1df3f35c0baebc412526d92b3ff644a7b9e797bd8e4719" Feb 02 12:00:03 crc kubenswrapper[4845]: I0202 12:00:03.973679 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-fhdkr" Feb 02 12:00:04 crc kubenswrapper[4845]: I0202 12:00:04.436339 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn"] Feb 02 12:00:04 crc kubenswrapper[4845]: I0202 12:00:04.445641 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-lbbxn"] Feb 02 12:00:05 crc kubenswrapper[4845]: I0202 12:00:05.727903 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe7b56f-4954-457d-8bb8-a0a50096cfb9" path="/var/lib/kubelet/pods/dfe7b56f-4954-457d-8bb8-a0a50096cfb9/volumes" Feb 02 12:00:12 crc kubenswrapper[4845]: I0202 12:00:12.445605 4845 scope.go:117] "RemoveContainer" containerID="7f56eeae3b5e853cc5e5d1ab4c3fe6f56e0c913955d2cd95163f2033cd7e1417" Feb 02 12:00:14 crc kubenswrapper[4845]: I0202 12:00:14.713471 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 12:00:14 crc kubenswrapper[4845]: E0202 12:00:14.714162 4845 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2wnn9_openshift-machine-config-operator(ebf2f253-531f-4835-84c1-928680352f7f)\"" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" Feb 02 12:00:25 crc kubenswrapper[4845]: I0202 12:00:25.713642 4845 scope.go:117] "RemoveContainer" containerID="3a4a1d82ec8333e6a28c6362b40eb2696c27da7912949e706323c4e360c22f75" Feb 02 12:00:27 crc kubenswrapper[4845]: I0202 12:00:27.209319 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" 
event={"ID":"ebf2f253-531f-4835-84c1-928680352f7f","Type":"ContainerStarted","Data":"4b9bcae9c88976b116a4c6a1c0683c93187733cbf529aca20585d37c96e7fd95"} Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.164819 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29500561-gzmjj"] Feb 02 12:01:00 crc kubenswrapper[4845]: E0202 12:01:00.166677 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c522af1-b700-4069-82b9-0c84cb693b9e" containerName="collect-profiles" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.166712 4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c522af1-b700-4069-82b9-0c84cb693b9e" containerName="collect-profiles" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.167334 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c522af1-b700-4069-82b9-0c84cb693b9e" containerName="collect-profiles" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.169341 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.177967 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500561-gzmjj"] Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.302741 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-fernet-keys\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.303208 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-combined-ca-bundle\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.303536 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-config-data\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.303675 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8dxk\" (UniqueName: \"kubernetes.io/projected/abddecef-869d-410a-9658-52b8eb816fd7-kube-api-access-h8dxk\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.406086 4845 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-fernet-keys\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.406247 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-combined-ca-bundle\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.406359 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-config-data\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.406418 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8dxk\" (UniqueName: \"kubernetes.io/projected/abddecef-869d-410a-9658-52b8eb816fd7-kube-api-access-h8dxk\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.413051 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-config-data\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.413517 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-combined-ca-bundle\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.419080 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-fernet-keys\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.427223 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8dxk\" (UniqueName: \"kubernetes.io/projected/abddecef-869d-410a-9658-52b8eb816fd7-kube-api-access-h8dxk\") pod \"keystone-cron-29500561-gzmjj\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.496148 4845 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:00 crc kubenswrapper[4845]: I0202 12:01:00.993447 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500561-gzmjj"] Feb 02 12:01:01 crc kubenswrapper[4845]: W0202 12:01:01.000594 4845 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabddecef_869d_410a_9658_52b8eb816fd7.slice/crio-41f7fb5ab2f958c2d29c01a42788d8c823094bd7f1c2de8ec61b848eb9995865 WatchSource:0}: Error finding container 41f7fb5ab2f958c2d29c01a42788d8c823094bd7f1c2de8ec61b848eb9995865: Status 404 returned error can't find the container with id 41f7fb5ab2f958c2d29c01a42788d8c823094bd7f1c2de8ec61b848eb9995865 Feb 02 12:01:01 crc kubenswrapper[4845]: I0202 12:01:01.539966 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500561-gzmjj" event={"ID":"abddecef-869d-410a-9658-52b8eb816fd7","Type":"ContainerStarted","Data":"446b47f81c8aef222c759e53b536fd0c7ed3763c3fb014aa5772153f55483f07"} Feb 02 12:01:01 crc kubenswrapper[4845]: I0202 12:01:01.540274 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500561-gzmjj" event={"ID":"abddecef-869d-410a-9658-52b8eb816fd7","Type":"ContainerStarted","Data":"41f7fb5ab2f958c2d29c01a42788d8c823094bd7f1c2de8ec61b848eb9995865"} Feb 02 12:01:01 crc kubenswrapper[4845]: I0202 12:01:01.562711 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29500561-gzmjj" podStartSLOduration=1.562689113 podStartE2EDuration="1.562689113s" podCreationTimestamp="2026-02-02 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:01.556050163 +0000 UTC m=+5342.647451613" watchObservedRunningTime="2026-02-02 12:01:01.562689113 +0000 UTC m=+5342.654090563" Feb 02 12:01:05 crc 
kubenswrapper[4845]: I0202 12:01:05.581587 4845 generic.go:334] "Generic (PLEG): container finished" podID="abddecef-869d-410a-9658-52b8eb816fd7" containerID="446b47f81c8aef222c759e53b536fd0c7ed3763c3fb014aa5772153f55483f07" exitCode=0 Feb 02 12:01:05 crc kubenswrapper[4845]: I0202 12:01:05.581677 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500561-gzmjj" event={"ID":"abddecef-869d-410a-9658-52b8eb816fd7","Type":"ContainerDied","Data":"446b47f81c8aef222c759e53b536fd0c7ed3763c3fb014aa5772153f55483f07"} Feb 02 12:01:06 crc kubenswrapper[4845]: I0202 12:01:06.979713 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.076520 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8dxk\" (UniqueName: \"kubernetes.io/projected/abddecef-869d-410a-9658-52b8eb816fd7-kube-api-access-h8dxk\") pod \"abddecef-869d-410a-9658-52b8eb816fd7\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.076923 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-fernet-keys\") pod \"abddecef-869d-410a-9658-52b8eb816fd7\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.077072 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-config-data\") pod \"abddecef-869d-410a-9658-52b8eb816fd7\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.077420 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-combined-ca-bundle\") pod \"abddecef-869d-410a-9658-52b8eb816fd7\" (UID: \"abddecef-869d-410a-9658-52b8eb816fd7\") " Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.082675 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abddecef-869d-410a-9658-52b8eb816fd7-kube-api-access-h8dxk" (OuterVolumeSpecName: "kube-api-access-h8dxk") pod "abddecef-869d-410a-9658-52b8eb816fd7" (UID: "abddecef-869d-410a-9658-52b8eb816fd7"). InnerVolumeSpecName "kube-api-access-h8dxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.085722 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "abddecef-869d-410a-9658-52b8eb816fd7" (UID: "abddecef-869d-410a-9658-52b8eb816fd7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.109063 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abddecef-869d-410a-9658-52b8eb816fd7" (UID: "abddecef-869d-410a-9658-52b8eb816fd7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.134452 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-config-data" (OuterVolumeSpecName: "config-data") pod "abddecef-869d-410a-9658-52b8eb816fd7" (UID: "abddecef-869d-410a-9658-52b8eb816fd7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.181279 4845 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.181507 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8dxk\" (UniqueName: \"kubernetes.io/projected/abddecef-869d-410a-9658-52b8eb816fd7-kube-api-access-h8dxk\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.181595 4845 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.181647 4845 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abddecef-869d-410a-9658-52b8eb816fd7-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.604051 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500561-gzmjj" event={"ID":"abddecef-869d-410a-9658-52b8eb816fd7","Type":"ContainerDied","Data":"41f7fb5ab2f958c2d29c01a42788d8c823094bd7f1c2de8ec61b848eb9995865"} Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.604096 4845 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41f7fb5ab2f958c2d29c01a42788d8c823094bd7f1c2de8ec61b848eb9995865" Feb 02 12:01:07 crc kubenswrapper[4845]: I0202 12:01:07.604418 4845 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500561-gzmjj" Feb 02 12:02:46 crc kubenswrapper[4845]: I0202 12:02:46.237285 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:02:46 crc kubenswrapper[4845]: I0202 12:02:46.237880 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:03:16 crc kubenswrapper[4845]: I0202 12:03:16.237408 4845 patch_prober.go:28] interesting pod/machine-config-daemon-2wnn9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:03:16 crc kubenswrapper[4845]: I0202 12:03:16.237931 4845 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2wnn9" podUID="ebf2f253-531f-4835-84c1-928680352f7f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:03:24 crc kubenswrapper[4845]: I0202 12:03:24.952406 4845 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wxhjh"] Feb 02 12:03:24 crc kubenswrapper[4845]: E0202 12:03:24.956381 4845 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abddecef-869d-410a-9658-52b8eb816fd7" containerName="keystone-cron" Feb 02 12:03:24 crc kubenswrapper[4845]: I0202 12:03:24.956424 
4845 state_mem.go:107] "Deleted CPUSet assignment" podUID="abddecef-869d-410a-9658-52b8eb816fd7" containerName="keystone-cron" Feb 02 12:03:24 crc kubenswrapper[4845]: I0202 12:03:24.956959 4845 memory_manager.go:354] "RemoveStaleState removing state" podUID="abddecef-869d-410a-9658-52b8eb816fd7" containerName="keystone-cron" Feb 02 12:03:24 crc kubenswrapper[4845]: I0202 12:03:24.975153 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:24 crc kubenswrapper[4845]: I0202 12:03:24.984565 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxhjh"] Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.037227 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qf8\" (UniqueName: \"kubernetes.io/projected/0cce6e08-6894-471c-8db2-1846df2e73bb-kube-api-access-q6qf8\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.037433 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-catalog-content\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.037518 4845 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-utilities\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 
12:03:25.140762 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-catalog-content\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.141256 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-utilities\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.141543 4845 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qf8\" (UniqueName: \"kubernetes.io/projected/0cce6e08-6894-471c-8db2-1846df2e73bb-kube-api-access-q6qf8\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.141884 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-catalog-content\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.144147 4845 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-utilities\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh" Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.166382 4845 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qf8\" (UniqueName: \"kubernetes.io/projected/0cce6e08-6894-471c-8db2-1846df2e73bb-kube-api-access-q6qf8\") pod \"certified-operators-wxhjh\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") " pod="openshift-marketplace/certified-operators-wxhjh"
Feb 02 12:03:25 crc kubenswrapper[4845]: I0202 12:03:25.308622 4845 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxhjh"
Feb 02 12:03:26 crc kubenswrapper[4845]: I0202 12:03:26.048311 4845 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wxhjh"]
Feb 02 12:03:26 crc kubenswrapper[4845]: I0202 12:03:26.097742 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhjh" event={"ID":"0cce6e08-6894-471c-8db2-1846df2e73bb","Type":"ContainerStarted","Data":"58b4b61f11c41e7a9802231b94ce0f3773ae77afd46dde6a7945c82fbc472073"}
Feb 02 12:03:27 crc kubenswrapper[4845]: I0202 12:03:27.112720 4845 generic.go:334] "Generic (PLEG): container finished" podID="0cce6e08-6894-471c-8db2-1846df2e73bb" containerID="9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c" exitCode=0
Feb 02 12:03:27 crc kubenswrapper[4845]: I0202 12:03:27.113016 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhjh" event={"ID":"0cce6e08-6894-471c-8db2-1846df2e73bb","Type":"ContainerDied","Data":"9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c"}
Feb 02 12:03:28 crc kubenswrapper[4845]: I0202 12:03:28.128280 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhjh" event={"ID":"0cce6e08-6894-471c-8db2-1846df2e73bb","Type":"ContainerStarted","Data":"84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89"}
Feb 02 12:03:30 crc kubenswrapper[4845]: I0202 12:03:30.150632 4845 generic.go:334] "Generic (PLEG): container finished" podID="0cce6e08-6894-471c-8db2-1846df2e73bb" containerID="84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89" exitCode=0
Feb 02 12:03:30 crc kubenswrapper[4845]: I0202 12:03:30.150684 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhjh" event={"ID":"0cce6e08-6894-471c-8db2-1846df2e73bb","Type":"ContainerDied","Data":"84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89"}
Feb 02 12:03:32 crc kubenswrapper[4845]: I0202 12:03:32.175630 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhjh" event={"ID":"0cce6e08-6894-471c-8db2-1846df2e73bb","Type":"ContainerStarted","Data":"d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242"}
Feb 02 12:03:32 crc kubenswrapper[4845]: I0202 12:03:32.209570 4845 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wxhjh" podStartSLOduration=4.616602257 podStartE2EDuration="8.209545021s" podCreationTimestamp="2026-02-02 12:03:24 +0000 UTC" firstStartedPulling="2026-02-02 12:03:27.117503277 +0000 UTC m=+5488.208904727" lastFinishedPulling="2026-02-02 12:03:30.710446041 +0000 UTC m=+5491.801847491" observedRunningTime="2026-02-02 12:03:32.198955918 +0000 UTC m=+5493.290357378" watchObservedRunningTime="2026-02-02 12:03:32.209545021 +0000 UTC m=+5493.300946481"
Feb 02 12:03:35 crc kubenswrapper[4845]: I0202 12:03:35.309243 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wxhjh"
Feb 02 12:03:35 crc kubenswrapper[4845]: I0202 12:03:35.309949 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wxhjh"
Feb 02 12:03:35 crc kubenswrapper[4845]: I0202 12:03:35.366209 4845 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wxhjh"
Feb 02 12:03:36 crc kubenswrapper[4845]: I0202 12:03:36.286373 4845 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wxhjh"
Feb 02 12:03:36 crc kubenswrapper[4845]: I0202 12:03:36.349875 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wxhjh"]
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.250421 4845 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wxhjh" podUID="0cce6e08-6894-471c-8db2-1846df2e73bb" containerName="registry-server" containerID="cri-o://d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242" gracePeriod=2
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.800941 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxhjh"
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.818261 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-catalog-content\") pod \"0cce6e08-6894-471c-8db2-1846df2e73bb\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") "
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.818339 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-utilities\") pod \"0cce6e08-6894-471c-8db2-1846df2e73bb\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") "
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.818407 4845 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6qf8\" (UniqueName: \"kubernetes.io/projected/0cce6e08-6894-471c-8db2-1846df2e73bb-kube-api-access-q6qf8\") pod \"0cce6e08-6894-471c-8db2-1846df2e73bb\" (UID: \"0cce6e08-6894-471c-8db2-1846df2e73bb\") "
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.819910 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-utilities" (OuterVolumeSpecName: "utilities") pod "0cce6e08-6894-471c-8db2-1846df2e73bb" (UID: "0cce6e08-6894-471c-8db2-1846df2e73bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.841355 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cce6e08-6894-471c-8db2-1846df2e73bb-kube-api-access-q6qf8" (OuterVolumeSpecName: "kube-api-access-q6qf8") pod "0cce6e08-6894-471c-8db2-1846df2e73bb" (UID: "0cce6e08-6894-471c-8db2-1846df2e73bb"). InnerVolumeSpecName "kube-api-access-q6qf8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.909863 4845 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cce6e08-6894-471c-8db2-1846df2e73bb" (UID: "0cce6e08-6894-471c-8db2-1846df2e73bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.921418 4845 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.921471 4845 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6qf8\" (UniqueName: \"kubernetes.io/projected/0cce6e08-6894-471c-8db2-1846df2e73bb-kube-api-access-q6qf8\") on node \"crc\" DevicePath \"\""
Feb 02 12:03:38 crc kubenswrapper[4845]: I0202 12:03:38.921483 4845 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cce6e08-6894-471c-8db2-1846df2e73bb-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.262696 4845 generic.go:334] "Generic (PLEG): container finished" podID="0cce6e08-6894-471c-8db2-1846df2e73bb" containerID="d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242" exitCode=0
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.262762 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhjh" event={"ID":"0cce6e08-6894-471c-8db2-1846df2e73bb","Type":"ContainerDied","Data":"d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242"}
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.262796 4845 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wxhjh" event={"ID":"0cce6e08-6894-471c-8db2-1846df2e73bb","Type":"ContainerDied","Data":"58b4b61f11c41e7a9802231b94ce0f3773ae77afd46dde6a7945c82fbc472073"}
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.262817 4845 scope.go:117] "RemoveContainer" containerID="d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.263109 4845 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wxhjh"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.286148 4845 scope.go:117] "RemoveContainer" containerID="84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.307568 4845 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wxhjh"]
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.324750 4845 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wxhjh"]
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.324845 4845 scope.go:117] "RemoveContainer" containerID="9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.364134 4845 scope.go:117] "RemoveContainer" containerID="d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242"
Feb 02 12:03:39 crc kubenswrapper[4845]: E0202 12:03:39.364592 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242\": container with ID starting with d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242 not found: ID does not exist" containerID="d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.364625 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242"} err="failed to get container status \"d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242\": rpc error: code = NotFound desc = could not find container \"d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242\": container with ID starting with d3ec9a9ced8824a1e5528138ddad98c41c5eb624391eeb46586edcaaca306242 not found: ID does not exist"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.364648 4845 scope.go:117] "RemoveContainer" containerID="84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89"
Feb 02 12:03:39 crc kubenswrapper[4845]: E0202 12:03:39.365105 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89\": container with ID starting with 84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89 not found: ID does not exist" containerID="84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.365152 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89"} err="failed to get container status \"84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89\": rpc error: code = NotFound desc = could not find container \"84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89\": container with ID starting with 84fae2069a8b6dc0ad3bdf85e6f70858e60543e9ac9625a268afdf26aebdad89 not found: ID does not exist"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.365178 4845 scope.go:117] "RemoveContainer" containerID="9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c"
Feb 02 12:03:39 crc kubenswrapper[4845]: E0202 12:03:39.365585 4845 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c\": container with ID starting with 9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c not found: ID does not exist" containerID="9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.365613 4845 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c"} err="failed to get container status \"9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c\": rpc error: code = NotFound desc = could not find container \"9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c\": container with ID starting with 9dc80a0800420b098a6aaded96b161dd9ad1323a5c391d8608dfbbc546392a2c not found: ID does not exist"
Feb 02 12:03:39 crc kubenswrapper[4845]: I0202 12:03:39.726264 4845 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cce6e08-6894-471c-8db2-1846df2e73bb" path="/var/lib/kubelet/pods/0cce6e08-6894-471c-8db2-1846df2e73bb/volumes"